Apr 17 14:18:28.354222 ip-10-0-138-3 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 14:18:28.354236 ip-10-0-138-3 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 14:18:28.354245 ip-10-0-138-3 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 14:18:28.354566 ip-10-0-138-3 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 14:18:39.718440 ip-10-0-138-3 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 14:18:39.718458 ip-10-0-138-3 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2ad3d26a8a1f4bd585aff15d3c2abe4b --
Apr 17 14:21:21.548181 ip-10-0-138-3 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 14:21:22.117734 ip-10-0-138-3 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:21:22.117734 ip-10-0-138-3 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 14:21:22.117734 ip-10-0-138-3 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:21:22.117734 ip-10-0-138-3 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 14:21:22.117734 ip-10-0-138-3 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:21:22.118979 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.118422    2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 14:21:22.123839 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123821    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:21:22.123839 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123839    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123843    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123846    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123849    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123852    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123855    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123858    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123860    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123863    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123866    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123891    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123894    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123897    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123900    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123902    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123905    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123908    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123910    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123913    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123915    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:21:22.123924 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123918    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123920    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123923    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123925    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123928    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123931    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123934    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123936    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123939    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123942    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123944    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123947    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123949    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123952    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123954    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123957    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123959    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123962    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123964    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:21:22.124422 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123967    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123971    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123975    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123977    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123980    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123982    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123984    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123987    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123990    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123992    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123994    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123997    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.123999    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124002    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124004    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124008    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124011    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124013    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124016    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:21:22.124908 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124021    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124024    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124028    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124031    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124035    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124037    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124040    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124043    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124045    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124048    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124051    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124053    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124056    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124059    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124062    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124064    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124067    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124070    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124072    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124075    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:21:22.125382 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124077    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124081    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124084    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124086    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124089    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124092    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124096    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124479    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124486    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124491    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124494    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124497    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124500    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124503    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124506    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124509    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124512    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124514    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124516    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124519    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:21:22.125977 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124522    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124524    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124526    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124529    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124532    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124536    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124539    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124542    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124544    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124547    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124549    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124552    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124554    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124556    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124559    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124562    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124564    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124567    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124569    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:21:22.126455 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124572    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124575    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124578    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124581    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124584    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124587    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124589    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124592    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124594    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124596    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124599    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124601    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124604    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124606    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124608    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124611    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124613    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124616    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124618    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124621    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:21:22.126944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124623    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124626    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124628    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124630    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124633    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124635    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124637    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124640    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124642    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124645    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124647    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124649    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124653    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124656    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124659    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124661    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124664    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124666    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124669    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124671    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:21:22.127405 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124674    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124676    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124680    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124682    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124684    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124687    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124689    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124691    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124694    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124696    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124699    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124701    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124704    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.124706    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126359    2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126370    2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126377    2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126382    2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126387    2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126390    2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126395    2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 14:21:22.127886 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126400    2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126404    2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126409    2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126412    2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126415    2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126419    2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126421    2575 flags.go:64] FLAG: --cgroup-root=""
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126424    2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126427    2575 flags.go:64] FLAG: --client-ca-file=""
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126430    2575 flags.go:64] FLAG: --cloud-config=""
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126433    2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126436    2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126441    2575 flags.go:64] FLAG: --cluster-domain=""
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126444    2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126447    2575 flags.go:64] FLAG: --config-dir=""
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126449    2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126453    2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126457    2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126460    2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126463    2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126467    2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126470    2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126473    2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126476    2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126480    2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 14:21:22.128421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126482    2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126488    2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126491    2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126494    2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126497    2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126500    2575 flags.go:64] FLAG: --enable-server="true"
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126503    2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126507    2575 flags.go:64] FLAG: --event-burst="100"
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126510    2575 flags.go:64] FLAG: --event-qps="50"
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126514    2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126517    2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126520    2575 flags.go:64] FLAG: --eviction-hard=""
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126524    2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126527    2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417
14:21:22.126530 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126533 2575 flags.go:64] FLAG: --eviction-soft="" Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126536 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126539 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126542 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126545 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126548 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126550 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126553 2575 flags.go:64] FLAG: --feature-gates="" Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126557 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126560 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 14:21:22.129062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126564 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126567 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126570 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126573 2575 flags.go:64] FLAG: --help="false" Apr 17 14:21:22.129710 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:21:22.126577 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126580 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126583 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126587 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126590 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126593 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126596 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126599 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126601 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126604 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126607 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126611 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126614 2575 flags.go:64] FLAG: --kube-reserved="" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126618 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" 
Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126621 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126624 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126627 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126630 2575 flags.go:64] FLAG: --lock-file="" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126633 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126636 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 14:21:22.129710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126639 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126645 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126648 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126651 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126654 2575 flags.go:64] FLAG: --logging-format="text" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126657 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126660 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126663 2575 flags.go:64] FLAG: --manifest-url="" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126666 2575 flags.go:64] 
FLAG: --manifest-url-header="" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126676 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126679 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126683 2575 flags.go:64] FLAG: --max-pods="110" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126686 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126690 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126692 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126695 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126699 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126702 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126705 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126713 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126716 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126719 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 14:21:22.130316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126722 2575 flags.go:64] FLAG: --pod-cidr="" Apr 17 14:21:22.130316 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:21:22.126725 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126731 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126735 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126738 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126741 2575 flags.go:64] FLAG: --port="10250" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126744 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126747 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0df97fec7a6820f11" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126750 2575 flags.go:64] FLAG: --qos-reserved="" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126753 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126756 2575 flags.go:64] FLAG: --register-node="true" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126759 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126762 2575 flags.go:64] FLAG: --register-with-taints="" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126766 2575 flags.go:64] FLAG: --registry-burst="10" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126769 2575 flags.go:64] FLAG: --registry-qps="5" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126772 2575 flags.go:64] 
FLAG: --reserved-cpus="" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126774 2575 flags.go:64] FLAG: --reserved-memory="" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126778 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126781 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126784 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126787 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126790 2575 flags.go:64] FLAG: --runonce="false" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126793 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126796 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126799 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126802 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126805 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 14:21:22.130862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126808 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126812 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126815 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 
14:21:22.126818 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126821 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126824 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126827 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126830 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126837 2575 flags.go:64] FLAG: --system-cgroups="" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126840 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126846 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126849 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126852 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126858 2575 flags.go:64] FLAG: --tls-min-version="" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126861 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126864 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126878 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126882 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 14:21:22.131529 
ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126885 2575 flags.go:64] FLAG: --v="2" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126890 2575 flags.go:64] FLAG: --version="false" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126895 2575 flags.go:64] FLAG: --vmodule="" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126899 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.126902 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127002 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:21:22.131529 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127006 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127010 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127013 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127016 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127018 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127021 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127024 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127026 2575 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127029 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127032 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127035 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127038 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127040 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127043 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127045 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127048 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127052 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127055 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127058 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 14:21:22.132121 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127060 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:21:22.132619 
ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127063 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127065 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127068 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127070 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127073 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127077 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127080 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127083 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127085 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127088 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127090 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127093 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127096 2575 feature_gate.go:328] 
unrecognized feature gate: InsightsConfig Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127098 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127101 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127103 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127106 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127108 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127111 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 14:21:22.132619 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127113 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127116 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127119 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127126 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127128 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127131 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127133 2575 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127136 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127138 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127144 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127147 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127149 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127152 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127154 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127157 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127160 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127162 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127165 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127168 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 
14:21:22.127171 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:21:22.133151 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127173 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127176 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127178 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127181 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127183 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127188 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127191 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127194 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127197 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127199 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127202 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127205 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127207 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127210 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127212 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127215 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127218 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127220 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127223 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 
14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127225 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:21:22.133648 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127228 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127233 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127236 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127238 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127241 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.127243 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.127249 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.133953 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.133971 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134020 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134038 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134042 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134045 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134048 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134051 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:21:22.134170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134053 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134056 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134058 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134061 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134063 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134066 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134068 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134072 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134074 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134077 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134080 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134082 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134085 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134087 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134090 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134092 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134094 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134097 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134099 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134102 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:21:22.134568 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134105 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134107 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134109 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134112 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134115 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134119 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134123 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134126 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134128 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134131 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134133 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134136 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134138 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134141 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134143 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134146 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134148 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134151 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134153 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134156 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:21:22.135060 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134158 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134160 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134163 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134165 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134168 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134170 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134172 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134175 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134178 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134182 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134185 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134187 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134190 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134193 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134195 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134197 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134200 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134203 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134206 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:21:22.135542 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134209 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134211 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134214 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134216 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134219 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134223 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134228 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134231 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134233 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134236 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134239 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134242 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134245 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134247 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134250 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134252 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134255 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134257 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134260 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:21:22.136009 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134262 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134265 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.134270 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134369 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134374 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134377 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134380 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134383 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134386 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134388 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134391 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134394 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134397 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134401 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134403 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134406 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:21:22.136477 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134408 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134411 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134414 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134416 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134419 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134421 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134423 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134426 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134429 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134431 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134433 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134437 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134441 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134443 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134446 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134449 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134451 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134453 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134456 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:21:22.136993 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134458 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134462 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134465 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134468 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134471 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134473 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134476 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134478 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134481 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134484 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134487 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134490 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134492 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134495 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134497 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134500 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134502 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134504 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134507 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:21:22.137463 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134509 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134511 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134514 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134516 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134519 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134521 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134524 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134526 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134528 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134531 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134533 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134536 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134538 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134540 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134543 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134545 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134548 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134550 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134553 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134555 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:21:22.137940 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134557 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134560 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134563 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134565 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134568 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134571 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134574 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134576 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134579 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134581 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134584 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134586 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134589 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134591 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:22.134593 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.134598 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:21:22.138425 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.135687 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 14:21:22.139789 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.139774 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 14:21:22.140982 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.140970 2575 server.go:1019] "Starting client certificate rotation"
Apr 17 14:21:22.141080 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.141062 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:21:22.141109 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.141099 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:21:22.173161 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.173136 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:21:22.177140 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.177117 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:21:22.198294 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.198262 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 17 14:21:22.204900 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.204865 2575 log.go:25] "Validated CRI v1 image API"
Apr 17 14:21:22.205409 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.205392 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:21:22.206291 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.206269 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 14:21:22.211897 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.211859 2575 fs.go:135] Filesystem UUIDs: map[1ab1a94d-82df-4c2d-b5b0-0f00302297fe:/dev/nvme0n1p4 49ffc3b4-89ce-48e5-94aa-cd6209eb5473:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 17 14:21:22.211987 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.211894 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 14:21:22.218202 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.218085 2575 manager.go:217] Machine: {Timestamp:2026-04-17 14:21:22.215528198 +0000 UTC m=+0.522281819 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3198458 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f17dc11ca4b8e5299130265b23288 SystemUUID:ec2f17dc-11ca-4b8e-5299-130265b23288 BootID:2ad3d26a-8a1f-4bd5-85af-f15d3c2abe4b Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9e:7c:75:8f:ad Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9e:7c:75:8f:ad Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:28:4b:e0:f6:01 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 14:21:22.218202 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.218196 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 14:21:22.218312 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.218283 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 14:21:22.219512 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.219483 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 14:21:22.219654 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.219516 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-3.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 14:21:22.219699 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.219663 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 14:21:22.219699 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.219672 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 14:21:22.219699 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.219685 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 14:21:22.220603 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.220583 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j4rmg"
Apr 17 14:21:22.220603 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.220597 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 14:21:22.222527 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.222516 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 14:21:22.222633 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.222625 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 14:21:22.225729 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.225719 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 14:21:22.225774 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.225735 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 14:21:22.225774 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.225748 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 14:21:22.225774 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.225758 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 17 14:21:22.225774 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.225766 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 14:21:22.226991 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.226979 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 14:21:22.227039 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.226996 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 14:21:22.227978 ip-10-0-138-3 kubenswrapper[2575]: I0417
14:21:22.227960 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j4rmg" Apr 17 14:21:22.231082 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.231061 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 14:21:22.232739 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.232726 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 14:21:22.236153 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.236112 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 14:21:22.237499 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.237482 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 14:21:22.237499 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.237500 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 14:21:22.237605 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.237507 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 14:21:22.237605 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.237516 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 14:21:22.237605 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.237527 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 14:21:22.237605 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.237536 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 14:21:22.237605 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.237542 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 14:21:22.237605 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.237554 2575 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 14:21:22.237605 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.237564 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 14:21:22.237605 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.237580 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 14:21:22.237605 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.237590 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 14:21:22.238760 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.238734 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 14:21:22.238812 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.238773 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 14:21:22.241971 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.241952 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:21:22.242902 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.242890 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 14:21:22.242960 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.242935 2575 server.go:1295] "Started kubelet" Apr 17 14:21:22.243035 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.243007 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 14:21:22.243107 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.243065 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 14:21:22.243142 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.243131 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 14:21:22.243966 ip-10-0-138-3 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 14:21:22.244076 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.244008 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:21:22.244958 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.244939 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 14:21:22.246247 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.246232 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 17 14:21:22.247292 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.247267 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-3.ec2.internal" not found Apr 17 14:21:22.252980 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.252957 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 14:21:22.254171 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.254154 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 14:21:22.254514 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:22.254487 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 14:21:22.255003 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.254987 2575 factory.go:55] Registering systemd factory Apr 17 14:21:22.255077 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255010 2575 factory.go:223] Registration of the systemd container factory successfully Apr 17 14:21:22.255202 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255192 2575 factory.go:153] Registering CRI-O factory Apr 17 14:21:22.255234 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255204 2575 factory.go:223] Registration of the crio container factory successfully Apr 17 14:21:22.255264 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255252 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 14:21:22.255295 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255272 2575 factory.go:103] Registering Raw factory Apr 17 14:21:22.255295 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255282 2575 manager.go:1196] Started watching for new ooms in manager Apr 17 14:21:22.255513 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255492 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 14:21:22.255513 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255492 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 14:21:22.255642 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255520 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 14:21:22.255642 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255638 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 17 14:21:22.255642 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255644 2575 reconciler.go:26] "Reconciler: 
start to sync state" Apr 17 14:21:22.255804 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.255768 2575 manager.go:319] Starting recovery of all containers Apr 17 14:21:22.255804 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:22.255785 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-3.ec2.internal\" not found" Apr 17 14:21:22.256977 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.256956 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:21:22.261418 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:22.261398 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-3.ec2.internal\" not found" node="ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.263024 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.263004 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-3.ec2.internal" not found Apr 17 14:21:22.268339 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.268322 2575 manager.go:324] Recovery completed Apr 17 14:21:22.272357 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.272343 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:21:22.274432 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.274417 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-3.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:21:22.274493 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.274444 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-3.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:21:22.274493 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.274454 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-3.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:21:22.274923 ip-10-0-138-3 kubenswrapper[2575]: 
I0417 14:21:22.274909 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 14:21:22.274923 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.274921 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 14:21:22.275003 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.274938 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:21:22.278495 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.278478 2575 policy_none.go:49] "None policy: Start" Apr 17 14:21:22.278495 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.278493 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 14:21:22.278620 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.278502 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 17 14:21:22.340519 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.319994 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-3.ec2.internal" not found Apr 17 14:21:22.340519 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.331996 2575 manager.go:341] "Starting Device Plugin manager" Apr 17 14:21:22.340519 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:22.332027 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 14:21:22.340519 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.332038 2575 server.go:85] "Starting device plugin registration server" Apr 17 14:21:22.340519 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.332341 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 14:21:22.340519 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.332356 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 14:21:22.340519 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.332464 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 14:21:22.340519 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:21:22.332539 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 14:21:22.340519 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.332551 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 14:21:22.340519 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:22.333130 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 14:21:22.340519 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:22.333173 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-3.ec2.internal\" not found" Apr 17 14:21:22.396305 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.396221 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 14:21:22.397480 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.397464 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 14:21:22.397562 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.397496 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 14:21:22.397562 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.397515 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 14:21:22.397562 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.397521 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 14:21:22.397562 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:22.397555 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 14:21:22.400010 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.399990 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:21:22.432994 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.432964 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:21:22.434505 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.434488 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-3.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:21:22.434594 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.434524 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-3.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:21:22.434594 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.434535 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-3.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:21:22.434594 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.434559 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.444795 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.444771 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.444897 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:22.444797 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-3.ec2.internal\": node \"ip-10-0-138-3.ec2.internal\" not found" Apr 17 14:21:22.498004 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:21:22.497953 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-3.ec2.internal"] Apr 17 14:21:22.502432 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.502412 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.502533 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.502422 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.527483 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.527462 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.531729 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.531716 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.556636 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.556615 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fcb38c6600a2cdeab28c1b9fce28f12c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-3.ec2.internal\" (UID: \"fcb38c6600a2cdeab28c1b9fce28f12c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.556707 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.556642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2b09b5827d36ff6ab7618b3998f1064e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal\" (UID: \"2b09b5827d36ff6ab7618b3998f1064e\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.556707 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.556663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b09b5827d36ff6ab7618b3998f1064e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal\" (UID: \"2b09b5827d36ff6ab7618b3998f1064e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.559658 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.559644 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:21:22.561339 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.561325 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:21:22.657657 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.657575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2b09b5827d36ff6ab7618b3998f1064e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal\" (UID: \"2b09b5827d36ff6ab7618b3998f1064e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.657657 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.657607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b09b5827d36ff6ab7618b3998f1064e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal\" (UID: \"2b09b5827d36ff6ab7618b3998f1064e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" Apr 17 
14:21:22.657657 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.657627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fcb38c6600a2cdeab28c1b9fce28f12c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-3.ec2.internal\" (UID: \"fcb38c6600a2cdeab28c1b9fce28f12c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.657842 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.657670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fcb38c6600a2cdeab28c1b9fce28f12c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-3.ec2.internal\" (UID: \"fcb38c6600a2cdeab28c1b9fce28f12c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.657842 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.657675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2b09b5827d36ff6ab7618b3998f1064e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal\" (UID: \"2b09b5827d36ff6ab7618b3998f1064e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.657842 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.657678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b09b5827d36ff6ab7618b3998f1064e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal\" (UID: \"2b09b5827d36ff6ab7618b3998f1064e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.863350 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.863314 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-3.ec2.internal" Apr 17 14:21:22.865849 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:22.865836 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" Apr 17 14:21:23.141065 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.141024 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 14:21:23.141673 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.141211 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:21:23.141673 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.141213 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:21:23.141673 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.141211 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:21:23.226956 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.226839 2575 apiserver.go:52] "Watching apiserver" Apr 17 14:21:23.230200 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.230171 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 14:16:22 +0000 UTC" 
deadline="2027-12-25 04:20:39.56754537 +0000 UTC" Apr 17 14:21:23.230200 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.230198 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14797h59m16.337349278s" Apr 17 14:21:23.238324 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.238302 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 14:21:23.238642 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.238620 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-9cwz2","openshift-network-operator/iptables-alerter-jrx2k","openshift-ovn-kubernetes/ovnkube-node-x4dft","kube-system/kube-apiserver-proxy-ip-10-0-138-3.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6","openshift-dns/node-resolver-kvtp7","openshift-image-registry/node-ca-58lbn","openshift-multus/network-metrics-daemon-ccsgf","kube-system/konnectivity-agent-bx274","openshift-cluster-node-tuning-operator/tuned-6xth9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal","openshift-multus/multus-additional-cni-plugins-vp9jm","openshift-multus/multus-gxf4n"] Apr 17 14:21:23.240966 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.240946 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:23.241065 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.241009 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:23.243103 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.243085 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jrx2k" Apr 17 14:21:23.247258 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.245618 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.247258 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.246906 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-j7vwj\"" Apr 17 14:21:23.247258 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.247109 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 14:21:23.247258 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.247142 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:21:23.247518 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.247371 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 14:21:23.247972 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.247938 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 14:21:23.248372 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.248349 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 14:21:23.248543 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.248527 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 14:21:23.248606 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.248545 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.249832 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.249792 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 14:21:23.249832 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.249803 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-b5swr\""
Apr 17 14:21:23.250039 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.249862 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 14:21:23.250039 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.249918 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 14:21:23.250626 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.250604 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kvtp7"
Apr 17 14:21:23.251017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.251003 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 14:21:23.251107 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.251006 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-v2v5k\""
Apr 17 14:21:23.251107 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.251007 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 14:21:23.251107 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.251007 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 14:21:23.252778 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.252765 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-58lbn"
Apr 17 14:21:23.252838 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.252779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 14:21:23.253060 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.253045 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 14:21:23.253163 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.253139 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 14:21:23.253206 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.253191 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-shpd6\""
Apr 17 14:21:23.255045 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.255030 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 14:21:23.255116 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.255043 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 14:21:23.255116 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.255077 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-nkhvw\""
Apr 17 14:21:23.255202 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.255132 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf"
Apr 17 14:21:23.255202 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.255186 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8"
Apr 17 14:21:23.255295 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.255224 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 14:21:23.257290 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.257273 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bx274"
Apr 17 14:21:23.259565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.259544 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fnm6z\""
Apr 17 14:21:23.259717 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.259571 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 14:21:23.259789 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.259733 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 14:21:23.259975 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.259961 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6xth9"
Apr 17 14:21:23.260850 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.260833 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-device-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.260938 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.260862 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-slash\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.260938 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.260897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-etc-openvswitch\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.260938 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.260920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3081116a-d0b8-4212-ab0e-d65967b55e93-iptables-alerter-script\") pod \"iptables-alerter-jrx2k\" (UID: \"3081116a-d0b8-4212-ab0e-d65967b55e93\") " pod="openshift-network-operator/iptables-alerter-jrx2k"
Apr 17 14:21:23.261109 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.260966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3081116a-d0b8-4212-ab0e-d65967b55e93-host-slash\") pod \"iptables-alerter-jrx2k\" (UID: \"3081116a-d0b8-4212-ab0e-d65967b55e93\") " pod="openshift-network-operator/iptables-alerter-jrx2k"
Apr 17 14:21:23.261109 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f-host\") pod \"node-ca-58lbn\" (UID: \"822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f\") " pod="openshift-image-registry/node-ca-58lbn"
Apr 17 14:21:23.261109 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4rmz\" (UniqueName: \"kubernetes.io/projected/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-kube-api-access-d4rmz\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf"
Apr 17 14:21:23.261109 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-run-ovn\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.261109 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261081 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.261109 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8508015-adfb-42aa-acfc-92b24ec90241-tmp-dir\") pod \"node-resolver-kvtp7\" (UID: \"e8508015-adfb-42aa-acfc-92b24ec90241\") " pod="openshift-dns/node-resolver-kvtp7"
Apr 17 14:21:23.261369 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261132 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf"
Apr 17 14:21:23.261369 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261155 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-cni-netd\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.261369 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.261369 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-sys-fs\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.261369 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e8508015-adfb-42aa-acfc-92b24ec90241-hosts-file\") pod \"node-resolver-kvtp7\" (UID: \"e8508015-adfb-42aa-acfc-92b24ec90241\") " pod="openshift-dns/node-resolver-kvtp7"
Apr 17 14:21:23.261369 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmwrs\" (UniqueName: \"kubernetes.io/projected/e8508015-adfb-42aa-acfc-92b24ec90241-kube-api-access-xmwrs\") pod \"node-resolver-kvtp7\" (UID: \"e8508015-adfb-42aa-acfc-92b24ec90241\") " pod="openshift-dns/node-resolver-kvtp7"
Apr 17 14:21:23.261369 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261327 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f386792e-2dbd-4e36-af17-6dbd71a6ad31-agent-certs\") pod \"konnectivity-agent-bx274\" (UID: \"f386792e-2dbd-4e36-af17-6dbd71a6ad31\") " pod="kube-system/konnectivity-agent-bx274"
Apr 17 14:21:23.261369 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-kubelet\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.261724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261380 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-cni-bin\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.261724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzwjz\" (UniqueName: \"kubernetes.io/projected/31c597ae-daa7-47cf-855c-9a1613df2d3f-kube-api-access-lzwjz\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.261724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261428 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-log-socket\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.261724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zrz7\" (UniqueName: \"kubernetes.io/projected/822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f-kube-api-access-8zrz7\") pod \"node-ca-58lbn\" (UID: \"822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f\") " pod="openshift-image-registry/node-ca-58lbn"
Apr 17 14:21:23.261724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261585 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-run-netns\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.261724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261615 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-var-lib-openvswitch\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.261724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261639 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-node-log\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.261724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-env-overrides\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261724 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f-serviceca\") pod \"node-ca-58lbn\" (UID: \"822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f\") " pod="openshift-image-registry/node-ca-58lbn"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q5wk\" (UniqueName: \"kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk\") pod \"network-check-target-9cwz2\" (UID: \"39622ed6-2176-4d02-9823-0818852bbb2d\") " pod="openshift-network-diagnostics/network-check-target-9cwz2"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-systemd-units\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261804 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-ovnkube-config\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f386792e-2dbd-4e36-af17-6dbd71a6ad31-konnectivity-ca\") pod \"konnectivity-agent-bx274\" (UID: \"f386792e-2dbd-4e36-af17-6dbd71a6ad31\") " pod="kube-system/konnectivity-agent-bx274"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261835 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-run-openvswitch\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-ovnkube-script-lib\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261864 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-registration-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-etc-selinux\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfwsw\" (UniqueName: \"kubernetes.io/projected/3081116a-d0b8-4212-ab0e-d65967b55e93-kube-api-access-tfwsw\") pod \"iptables-alerter-jrx2k\" (UID: \"3081116a-d0b8-4212-ab0e-d65967b55e93\") " pod="openshift-network-operator/iptables-alerter-jrx2k"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-run-systemd\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-ovn-node-metrics-cert\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.261969 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pfnr\" (UniqueName: \"kubernetes.io/projected/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-kube-api-access-4pfnr\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.262017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.262014 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-socket-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.262629 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.262348 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5nmjv\""
Apr 17 14:21:23.262629 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.262427 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:21:23.262629 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.262531 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 14:21:23.262727 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.262688 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vp9jm"
Apr 17 14:21:23.265055 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.265039 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gxf4n"
Apr 17 14:21:23.265307 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.265201 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 14:21:23.265307 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.265229 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 14:21:23.265473 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.265325 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6hqpg\""
Apr 17 14:21:23.265473 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.265332 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 14:21:23.265473 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.265356 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 14:21:23.265569 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.265554 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 14:21:23.267529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.267507 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:21:23.267629 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.267566 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 14:21:23.267629 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.267581 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q5qqg\""
Apr 17 14:21:23.285688 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.285667 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-f5mkn"
Apr 17 14:21:23.290724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.290702 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-f5mkn"
Apr 17 14:21:23.356226 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.356205 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 14:21:23.362986 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.362963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkj42\" (UniqueName: \"kubernetes.io/projected/0d1f7ee5-7856-41fe-9646-4344f387a26d-kube-api-access-xkj42\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9"
Apr 17 14:21:23.363087 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f386792e-2dbd-4e36-af17-6dbd71a6ad31-konnectivity-ca\") pod \"konnectivity-agent-bx274\" (UID: \"f386792e-2dbd-4e36-af17-6dbd71a6ad31\") " pod="kube-system/konnectivity-agent-bx274"
Apr 17 14:21:23.363087 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-run-openvswitch\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.363087 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-ovnkube-script-lib\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.363087 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-etc-selinux\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.363274 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363116 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-run-openvswitch\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.363274 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-etc-selinux\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.363274 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm"
Apr 17 14:21:23.363274 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363196 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfwsw\" (UniqueName: \"kubernetes.io/projected/3081116a-d0b8-4212-ab0e-d65967b55e93-kube-api-access-tfwsw\") pod \"iptables-alerter-jrx2k\" (UID: \"3081116a-d0b8-4212-ab0e-d65967b55e93\") " pod="openshift-network-operator/iptables-alerter-jrx2k"
Apr 17 14:21:23.363274 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-run-systemd\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.363274 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-ovn-node-metrics-cert\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.363274 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363272 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-socket-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-cnibin\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-modprobe-d\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-sysctl-d\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363353 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-run-systemd\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363375 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-var-lib-kubelet\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-etc-openvswitch\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-socket-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-socket-dir-parent\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-etc-openvswitch\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-run-netns\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-run-multus-certs\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-etc-kubernetes\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n"
Apr 17 14:21:23.363565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f386792e-2dbd-4e36-af17-6dbd71a6ad31-konnectivity-ca\") pod \"konnectivity-agent-bx274\" (UID: \"f386792e-2dbd-4e36-af17-6dbd71a6ad31\") " pod="kube-system/konnectivity-agent-bx274"
Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363575 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-tuned\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9"
Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3081116a-d0b8-4212-ab0e-d65967b55e93-iptables-alerter-script\") pod \"iptables-alerter-jrx2k\" (UID: \"3081116a-d0b8-4212-ab0e-d65967b55e93\") " pod="openshift-network-operator/iptables-alerter-jrx2k"
Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6"
Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363615 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled.
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-var-lib-cni-bin\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363672 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-var-lib-cni-multus\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363684 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363695 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-hostroot\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/e8508015-adfb-42aa-acfc-92b24ec90241-tmp-dir\") pod \"node-resolver-kvtp7\" (UID: \"e8508015-adfb-42aa-acfc-92b24ec90241\") " pod="openshift-dns/node-resolver-kvtp7" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-ovnkube-script-lib\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-sys-fs\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363829 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.363885 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363921 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7ls7\" (UniqueName: \"kubernetes.io/projected/207ded88-4793-4bf3-9d1a-a6775c96a280-kube-api-access-l7ls7\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363930 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-sys-fs\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" Apr 17 14:21:23.364101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.363948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-sysconfig\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.363983 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs 
podName:d0eca2ae-83f6-462e-b7d9-9ab1592717a8 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:23.86396778 +0000 UTC m=+2.170721381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs") pod "network-metrics-daemon-ccsgf" (UID: "d0eca2ae-83f6-462e-b7d9-9ab1592717a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-kubernetes\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8508015-adfb-42aa-acfc-92b24ec90241-tmp-dir\") pod \"node-resolver-kvtp7\" (UID: \"e8508015-adfb-42aa-acfc-92b24ec90241\") " pod="openshift-dns/node-resolver-kvtp7" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e8508015-adfb-42aa-acfc-92b24ec90241-hosts-file\") pod \"node-resolver-kvtp7\" (UID: \"e8508015-adfb-42aa-acfc-92b24ec90241\") " pod="openshift-dns/node-resolver-kvtp7" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmwrs\" (UniqueName: \"kubernetes.io/projected/e8508015-adfb-42aa-acfc-92b24ec90241-kube-api-access-xmwrs\") pod \"node-resolver-kvtp7\" (UID: 
\"e8508015-adfb-42aa-acfc-92b24ec90241\") " pod="openshift-dns/node-resolver-kvtp7" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3081116a-d0b8-4212-ab0e-d65967b55e93-iptables-alerter-script\") pod \"iptables-alerter-jrx2k\" (UID: \"3081116a-d0b8-4212-ab0e-d65967b55e93\") " pod="openshift-network-operator/iptables-alerter-jrx2k" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f386792e-2dbd-4e36-af17-6dbd71a6ad31-agent-certs\") pod \"konnectivity-agent-bx274\" (UID: \"f386792e-2dbd-4e36-af17-6dbd71a6ad31\") " pod="kube-system/konnectivity-agent-bx274" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364105 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e8508015-adfb-42aa-acfc-92b24ec90241-hosts-file\") pod \"node-resolver-kvtp7\" (UID: \"e8508015-adfb-42aa-acfc-92b24ec90241\") " pod="openshift-dns/node-resolver-kvtp7" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-systemd\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-log-socket\") pod \"ovnkube-node-x4dft\" (UID: 
\"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-node-log\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-node-log\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-ovnkube-config\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-log-socket\") pod \"ovnkube-node-x4dft\" (UID: 
\"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-run\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.365015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364284 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-registration-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364306 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-os-release\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/63505374-1c69-4d7f-853d-90e9526b6d12-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-registration-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364364 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-run-k8s-cni-cncf-io\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-sys\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pfnr\" (UniqueName: \"kubernetes.io/projected/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-kube-api-access-4pfnr\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364420 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-device-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364444 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-os-release\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364519 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/31c597ae-daa7-47cf-855c-9a1613df2d3f-device-dir\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-slash\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ntmv\" (UniqueName: \"kubernetes.io/projected/63505374-1c69-4d7f-853d-90e9526b6d12-kube-api-access-6ntmv\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 
14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364612 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-slash\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3081116a-d0b8-4212-ab0e-d65967b55e93-host-slash\") pod \"iptables-alerter-jrx2k\" (UID: \"3081116a-d0b8-4212-ab0e-d65967b55e93\") " pod="openshift-network-operator/iptables-alerter-jrx2k" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f-host\") pod \"node-ca-58lbn\" (UID: \"822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f\") " pod="openshift-image-registry/node-ca-58lbn" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364680 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3081116a-d0b8-4212-ab0e-d65967b55e93-host-slash\") pod \"iptables-alerter-jrx2k\" (UID: \"3081116a-d0b8-4212-ab0e-d65967b55e93\") " pod="openshift-network-operator/iptables-alerter-jrx2k" Apr 17 14:21:23.365828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4rmz\" (UniqueName: \"kubernetes.io/projected/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-kube-api-access-d4rmz\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:23.366630 
ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364708 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-run-ovn\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364716 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-ovnkube-config\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f-host\") pod \"node-ca-58lbn\" (UID: \"822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f\") " pod="openshift-image-registry/node-ca-58lbn" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-system-cni-dir\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-run-ovn\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.366630 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:21:23.364811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-cnibin\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63505374-1c69-4d7f-853d-90e9526b6d12-cni-binary-copy\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/63505374-1c69-4d7f-853d-90e9526b6d12-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-cni-netd\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/207ded88-4793-4bf3-9d1a-a6775c96a280-cni-binary-copy\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364975 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-var-lib-kubelet\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.364994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-cni-netd\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365021 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-sysctl-conf\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365044 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-host\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-kubelet\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365135 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-cni-bin\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.366630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365186 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-kubelet\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzwjz\" (UniqueName: \"kubernetes.io/projected/31c597ae-daa7-47cf-855c-9a1613df2d3f-kube-api-access-lzwjz\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-cni-bin\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365235 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-system-cni-dir\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zrz7\" (UniqueName: \"kubernetes.io/projected/822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f-kube-api-access-8zrz7\") pod \"node-ca-58lbn\" (UID: \"822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f\") " pod="openshift-image-registry/node-ca-58lbn" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-run-netns\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365505 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-host-run-netns\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-var-lib-openvswitch\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-env-overrides\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-var-lib-openvswitch\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-cni-dir\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-conf-dir\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-daemon-config\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365659 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f-serviceca\") pod \"node-ca-58lbn\" (UID: \"822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f\") " pod="openshift-image-registry/node-ca-58lbn" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5wk\" (UniqueName: \"kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk\") pod \"network-check-target-9cwz2\" (UID: \"39622ed6-2176-4d02-9823-0818852bbb2d\") " pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-systemd-units\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365767 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-lib-modules\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.367155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365793 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d1f7ee5-7856-41fe-9646-4344f387a26d-tmp\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.367600 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365811 
2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-systemd-units\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.367600 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.365909 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-env-overrides\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.367600 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.366010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f-serviceca\") pod \"node-ca-58lbn\" (UID: \"822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f\") " pod="openshift-image-registry/node-ca-58lbn" Apr 17 14:21:23.367600 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.366985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-ovn-node-metrics-cert\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.367600 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.367523 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f386792e-2dbd-4e36-af17-6dbd71a6ad31-agent-certs\") pod \"konnectivity-agent-bx274\" (UID: \"f386792e-2dbd-4e36-af17-6dbd71a6ad31\") " pod="kube-system/konnectivity-agent-bx274" Apr 17 14:21:23.375228 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.375192 2575 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:23.375334 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.375232 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:23.375334 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.375265 2575 projected.go:194] Error preparing data for projected volume kube-api-access-7q5wk for pod openshift-network-diagnostics/network-check-target-9cwz2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:23.375334 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.375321 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk podName:39622ed6-2176-4d02-9823-0818852bbb2d nodeName:}" failed. No retries permitted until 2026-04-17 14:21:23.875306703 +0000 UTC m=+2.182060301 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7q5wk" (UniqueName: "kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk") pod "network-check-target-9cwz2" (UID: "39622ed6-2176-4d02-9823-0818852bbb2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:23.376269 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.376245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfwsw\" (UniqueName: \"kubernetes.io/projected/3081116a-d0b8-4212-ab0e-d65967b55e93-kube-api-access-tfwsw\") pod \"iptables-alerter-jrx2k\" (UID: \"3081116a-d0b8-4212-ab0e-d65967b55e93\") " pod="openshift-network-operator/iptables-alerter-jrx2k" Apr 17 14:21:23.377187 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.377163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmwrs\" (UniqueName: \"kubernetes.io/projected/e8508015-adfb-42aa-acfc-92b24ec90241-kube-api-access-xmwrs\") pod \"node-resolver-kvtp7\" (UID: \"e8508015-adfb-42aa-acfc-92b24ec90241\") " pod="openshift-dns/node-resolver-kvtp7" Apr 17 14:21:23.377362 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.377344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zrz7\" (UniqueName: \"kubernetes.io/projected/822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f-kube-api-access-8zrz7\") pod \"node-ca-58lbn\" (UID: \"822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f\") " pod="openshift-image-registry/node-ca-58lbn" Apr 17 14:21:23.377442 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.377418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4rmz\" (UniqueName: \"kubernetes.io/projected/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-kube-api-access-d4rmz\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " 
pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:23.377526 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.377509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pfnr\" (UniqueName: \"kubernetes.io/projected/5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf-kube-api-access-4pfnr\") pod \"ovnkube-node-x4dft\" (UID: \"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.377836 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.377822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzwjz\" (UniqueName: \"kubernetes.io/projected/31c597ae-daa7-47cf-855c-9a1613df2d3f-kube-api-access-lzwjz\") pod \"aws-ebs-csi-driver-node-4dhg6\" (UID: \"31c597ae-daa7-47cf-855c-9a1613df2d3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" Apr 17 14:21:23.433210 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:23.433181 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcb38c6600a2cdeab28c1b9fce28f12c.slice/crio-27439a4aea2be1e150d14c3981ea3e8494cb94b67abf2081002734f1990226b3 WatchSource:0}: Error finding container 27439a4aea2be1e150d14c3981ea3e8494cb94b67abf2081002734f1990226b3: Status 404 returned error can't find the container with id 27439a4aea2be1e150d14c3981ea3e8494cb94b67abf2081002734f1990226b3 Apr 17 14:21:23.437155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.437125 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:21:23.443414 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:23.443393 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b09b5827d36ff6ab7618b3998f1064e.slice/crio-e675402dfec44ea1db05e7b521581497c289507be00bc7f7fbcd68e7216fa440 WatchSource:0}: Error finding container 
e675402dfec44ea1db05e7b521581497c289507be00bc7f7fbcd68e7216fa440: Status 404 returned error can't find the container with id e675402dfec44ea1db05e7b521581497c289507be00bc7f7fbcd68e7216fa440 Apr 17 14:21:23.466682 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-run\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.466784 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-os-release\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.466784 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/63505374-1c69-4d7f-853d-90e9526b6d12-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.466784 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-run-k8s-cni-cncf-io\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.466784 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466756 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-run-k8s-cni-cncf-io\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.466784 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-sys\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-run\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-sys\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466794 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-os-release\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466853 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-os-release\") pod 
\"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ntmv\" (UniqueName: \"kubernetes.io/projected/63505374-1c69-4d7f-853d-90e9526b6d12-kube-api-access-6ntmv\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466901 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-os-release\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-system-cni-dir\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-cnibin\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/63505374-1c69-4d7f-853d-90e9526b6d12-cni-binary-copy\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-system-cni-dir\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.466996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/63505374-1c69-4d7f-853d-90e9526b6d12-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467023 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-cnibin\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/207ded88-4793-4bf3-9d1a-a6775c96a280-cni-binary-copy\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467055 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-var-lib-kubelet\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-sysctl-conf\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-host\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-system-cni-dir\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467141 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-var-lib-kubelet\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467148 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-cni-dir\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467192 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-cni-dir\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467204 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-conf-dir\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467218 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-sysctl-conf\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-daemon-config\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467232 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-host\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-conf-dir\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-lib-modules\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-system-cni-dir\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d1f7ee5-7856-41fe-9646-4344f387a26d-tmp\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467316 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/63505374-1c69-4d7f-853d-90e9526b6d12-cni-sysctl-allowlist\") 
pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkj42\" (UniqueName: \"kubernetes.io/projected/0d1f7ee5-7856-41fe-9646-4344f387a26d-kube-api-access-xkj42\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.467668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467399 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-cnibin\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-lib-modules\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-modprobe-d\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-sysctl-d\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-var-lib-kubelet\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-socket-dir-parent\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-run-netns\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/63505374-1c69-4d7f-853d-90e9526b6d12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-run-multus-certs\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467540 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63505374-1c69-4d7f-853d-90e9526b6d12-cni-binary-copy\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-run-multus-certs\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467595 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/207ded88-4793-4bf3-9d1a-a6775c96a280-cni-binary-copy\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/63505374-1c69-4d7f-853d-90e9526b6d12-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-run-netns\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-var-lib-kubelet\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-daemon-config\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-etc-kubernetes\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-multus-socket-dir-parent\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.468476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-sysctl-d\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-cnibin\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-etc-kubernetes\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467732 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-modprobe-d\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-tuned\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-var-lib-cni-bin\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-var-lib-cni-multus\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-hostroot\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467839 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-var-lib-cni-bin\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7ls7\" (UniqueName: 
\"kubernetes.io/projected/207ded88-4793-4bf3-9d1a-a6775c96a280-kube-api-access-l7ls7\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-hostroot\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/207ded88-4793-4bf3-9d1a-a6775c96a280-host-var-lib-cni-multus\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467888 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-sysconfig\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-kubernetes\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467926 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-sysconfig\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-systemd\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-kubernetes\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.469325 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.467990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-systemd\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.469897 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.469607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0d1f7ee5-7856-41fe-9646-4344f387a26d-etc-tuned\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.469897 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.469706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/0d1f7ee5-7856-41fe-9646-4344f387a26d-tmp\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.474715 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.474690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ntmv\" (UniqueName: \"kubernetes.io/projected/63505374-1c69-4d7f-853d-90e9526b6d12-kube-api-access-6ntmv\") pod \"multus-additional-cni-plugins-vp9jm\" (UID: \"63505374-1c69-4d7f-853d-90e9526b6d12\") " pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.475125 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.475105 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7ls7\" (UniqueName: \"kubernetes.io/projected/207ded88-4793-4bf3-9d1a-a6775c96a280-kube-api-access-l7ls7\") pod \"multus-gxf4n\" (UID: \"207ded88-4793-4bf3-9d1a-a6775c96a280\") " pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.475171 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.475108 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkj42\" (UniqueName: \"kubernetes.io/projected/0d1f7ee5-7856-41fe-9646-4344f387a26d-kube-api-access-xkj42\") pod \"tuned-6xth9\" (UID: \"0d1f7ee5-7856-41fe-9646-4344f387a26d\") " pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.580115 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.580021 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-jrx2k" Apr 17 14:21:23.586666 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:23.586640 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3081116a_d0b8_4212_ab0e_d65967b55e93.slice/crio-aa3a6a569702a25cc8cf34c26d4e34a88b0a0a6b96014fe4d08fa9da19ed78b5 WatchSource:0}: Error finding container aa3a6a569702a25cc8cf34c26d4e34a88b0a0a6b96014fe4d08fa9da19ed78b5: Status 404 returned error can't find the container with id aa3a6a569702a25cc8cf34c26d4e34a88b0a0a6b96014fe4d08fa9da19ed78b5 Apr 17 14:21:23.600248 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.600228 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:23.606005 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:23.605962 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab40dbd_e2b5_4a82_83d0_3a4846afc3cf.slice/crio-5d3bc5c61b93fe9e50927d5172875488033f494a26b0330b9172d61722b0cad4 WatchSource:0}: Error finding container 5d3bc5c61b93fe9e50927d5172875488033f494a26b0330b9172d61722b0cad4: Status 404 returned error can't find the container with id 5d3bc5c61b93fe9e50927d5172875488033f494a26b0330b9172d61722b0cad4 Apr 17 14:21:23.611006 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.610985 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" Apr 17 14:21:23.616995 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:23.616973 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c597ae_daa7_47cf_855c_9a1613df2d3f.slice/crio-6aedafa249aa28472cd9e596315a6c96d87418d05a3f676457a90c618e6cfdce WatchSource:0}: Error finding container 6aedafa249aa28472cd9e596315a6c96d87418d05a3f676457a90c618e6cfdce: Status 404 returned error can't find the container with id 6aedafa249aa28472cd9e596315a6c96d87418d05a3f676457a90c618e6cfdce Apr 17 14:21:23.626385 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.626364 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kvtp7" Apr 17 14:21:23.632313 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:23.632290 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8508015_adfb_42aa_acfc_92b24ec90241.slice/crio-fb884e0b58bc027ab01f2463430a883328d10bbaa03bdb1d4bd077d7bb67d9ac WatchSource:0}: Error finding container fb884e0b58bc027ab01f2463430a883328d10bbaa03bdb1d4bd077d7bb67d9ac: Status 404 returned error can't find the container with id fb884e0b58bc027ab01f2463430a883328d10bbaa03bdb1d4bd077d7bb67d9ac Apr 17 14:21:23.638282 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.638264 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-58lbn" Apr 17 14:21:23.643804 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:23.643783 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod822ddccf_2c1a_42b8_8b4f_b676b2aa2c9f.slice/crio-29871e47e954df235a52d445c65f85e289ecb8263b21e61913365fc8ff1806bd WatchSource:0}: Error finding container 29871e47e954df235a52d445c65f85e289ecb8263b21e61913365fc8ff1806bd: Status 404 returned error can't find the container with id 29871e47e954df235a52d445c65f85e289ecb8263b21e61913365fc8ff1806bd Apr 17 14:21:23.654744 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.654725 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bx274" Apr 17 14:21:23.661597 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:23.661577 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf386792e_2dbd_4e36_af17_6dbd71a6ad31.slice/crio-c09ef33a75e9dd09b43f357967b2989b03e364c02dac9809b69dbc52b8ec6987 WatchSource:0}: Error finding container c09ef33a75e9dd09b43f357967b2989b03e364c02dac9809b69dbc52b8ec6987: Status 404 returned error can't find the container with id c09ef33a75e9dd09b43f357967b2989b03e364c02dac9809b69dbc52b8ec6987 Apr 17 14:21:23.667411 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.667393 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6xth9" Apr 17 14:21:23.673209 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:23.673182 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d1f7ee5_7856_41fe_9646_4344f387a26d.slice/crio-51689fb1b95d5513372750ee311e5e82e137a54229db6ca8d980d66e39e7799f WatchSource:0}: Error finding container 51689fb1b95d5513372750ee311e5e82e137a54229db6ca8d980d66e39e7799f: Status 404 returned error can't find the container with id 51689fb1b95d5513372750ee311e5e82e137a54229db6ca8d980d66e39e7799f Apr 17 14:21:23.691683 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.691662 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" Apr 17 14:21:23.697264 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:23.697232 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63505374_1c69_4d7f_853d_90e9526b6d12.slice/crio-b083cb7e43c033e1db6ce97a49516207c306e06a0392067de39914e1987d7ae0 WatchSource:0}: Error finding container b083cb7e43c033e1db6ce97a49516207c306e06a0392067de39914e1987d7ae0: Status 404 returned error can't find the container with id b083cb7e43c033e1db6ce97a49516207c306e06a0392067de39914e1987d7ae0 Apr 17 14:21:23.697818 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.697796 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-gxf4n" Apr 17 14:21:23.703308 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:21:23.703288 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod207ded88_4793_4bf3_9d1a_a6775c96a280.slice/crio-ae8731922b72e1ff71a0bdf3c9f83a8fce5b23a70bbe816a8457881c2d98f56b WatchSource:0}: Error finding container ae8731922b72e1ff71a0bdf3c9f83a8fce5b23a70bbe816a8457881c2d98f56b: Status 404 returned error can't find the container with id ae8731922b72e1ff71a0bdf3c9f83a8fce5b23a70bbe816a8457881c2d98f56b Apr 17 14:21:23.872027 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.871943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:23.872175 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.872091 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:23.872175 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.872170 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs podName:d0eca2ae-83f6-462e-b7d9-9ab1592717a8 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:24.872150033 +0000 UTC m=+3.178903641 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs") pod "network-metrics-daemon-ccsgf" (UID: "d0eca2ae-83f6-462e-b7d9-9ab1592717a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:23.973416 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:23.973377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5wk\" (UniqueName: \"kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk\") pod \"network-check-target-9cwz2\" (UID: \"39622ed6-2176-4d02-9823-0818852bbb2d\") " pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:23.973612 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.973554 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:23.973612 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.973574 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:23.973612 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.973587 2575 projected.go:194] Error preparing data for projected volume kube-api-access-7q5wk for pod openshift-network-diagnostics/network-check-target-9cwz2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:23.973773 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:23.973652 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk podName:39622ed6-2176-4d02-9823-0818852bbb2d nodeName:}" failed. 
No retries permitted until 2026-04-17 14:21:24.973632277 +0000 UTC m=+3.280385881 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7q5wk" (UniqueName: "kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk") pod "network-check-target-9cwz2" (UID: "39622ed6-2176-4d02-9823-0818852bbb2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:24.295180 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.294949 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:16:23 +0000 UTC" deadline="2027-12-15 02:19:33.400784251 +0000 UTC" Apr 17 14:21:24.295180 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.294990 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14555h58m9.105798991s" Apr 17 14:21:24.326276 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.326245 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:21:24.364653 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.364610 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:21:24.413420 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.413357 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kvtp7" event={"ID":"e8508015-adfb-42aa-acfc-92b24ec90241","Type":"ContainerStarted","Data":"fb884e0b58bc027ab01f2463430a883328d10bbaa03bdb1d4bd077d7bb67d9ac"} Apr 17 14:21:24.427653 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.427614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" 
event={"ID":"31c597ae-daa7-47cf-855c-9a1613df2d3f","Type":"ContainerStarted","Data":"6aedafa249aa28472cd9e596315a6c96d87418d05a3f676457a90c618e6cfdce"} Apr 17 14:21:24.433282 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.433246 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" event={"ID":"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf","Type":"ContainerStarted","Data":"5d3bc5c61b93fe9e50927d5172875488033f494a26b0330b9172d61722b0cad4"} Apr 17 14:21:24.453750 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.453711 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jrx2k" event={"ID":"3081116a-d0b8-4212-ab0e-d65967b55e93","Type":"ContainerStarted","Data":"aa3a6a569702a25cc8cf34c26d4e34a88b0a0a6b96014fe4d08fa9da19ed78b5"} Apr 17 14:21:24.466583 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.466539 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" event={"ID":"2b09b5827d36ff6ab7618b3998f1064e","Type":"ContainerStarted","Data":"e675402dfec44ea1db05e7b521581497c289507be00bc7f7fbcd68e7216fa440"} Apr 17 14:21:24.478210 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.478167 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-3.ec2.internal" event={"ID":"fcb38c6600a2cdeab28c1b9fce28f12c","Type":"ContainerStarted","Data":"27439a4aea2be1e150d14c3981ea3e8494cb94b67abf2081002734f1990226b3"} Apr 17 14:21:24.493570 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.493524 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gxf4n" event={"ID":"207ded88-4793-4bf3-9d1a-a6775c96a280","Type":"ContainerStarted","Data":"ae8731922b72e1ff71a0bdf3c9f83a8fce5b23a70bbe816a8457881c2d98f56b"} Apr 17 14:21:24.506790 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.506716 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6xth9" event={"ID":"0d1f7ee5-7856-41fe-9646-4344f387a26d","Type":"ContainerStarted","Data":"51689fb1b95d5513372750ee311e5e82e137a54229db6ca8d980d66e39e7799f"} Apr 17 14:21:24.511485 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.511374 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:21:24.537180 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.536979 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bx274" event={"ID":"f386792e-2dbd-4e36-af17-6dbd71a6ad31","Type":"ContainerStarted","Data":"c09ef33a75e9dd09b43f357967b2989b03e364c02dac9809b69dbc52b8ec6987"} Apr 17 14:21:24.549313 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.549224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" event={"ID":"63505374-1c69-4d7f-853d-90e9526b6d12","Type":"ContainerStarted","Data":"b083cb7e43c033e1db6ce97a49516207c306e06a0392067de39914e1987d7ae0"} Apr 17 14:21:24.564372 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.564324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-58lbn" event={"ID":"822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f","Type":"ContainerStarted","Data":"29871e47e954df235a52d445c65f85e289ecb8263b21e61913365fc8ff1806bd"} Apr 17 14:21:24.881741 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.881653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:24.881936 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:24.881823 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: 
object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:24.881936 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:24.881907 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs podName:d0eca2ae-83f6-462e-b7d9-9ab1592717a8 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:26.881885996 +0000 UTC m=+5.188639619 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs") pod "network-metrics-daemon-ccsgf" (UID: "d0eca2ae-83f6-462e-b7d9-9ab1592717a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:24.982250 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:24.982211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5wk\" (UniqueName: \"kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk\") pod \"network-check-target-9cwz2\" (UID: \"39622ed6-2176-4d02-9823-0818852bbb2d\") " pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:24.982493 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:24.982474 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:24.982576 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:24.982500 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:24.982576 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:24.982515 2575 projected.go:194] Error preparing data for projected volume kube-api-access-7q5wk for pod openshift-network-diagnostics/network-check-target-9cwz2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:24.982674 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:24.982577 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk podName:39622ed6-2176-4d02-9823-0818852bbb2d nodeName:}" failed. No retries permitted until 2026-04-17 14:21:26.982557597 +0000 UTC m=+5.289311209 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7q5wk" (UniqueName: "kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk") pod "network-check-target-9cwz2" (UID: "39622ed6-2176-4d02-9823-0818852bbb2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:25.296145 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:25.296049 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:16:23 +0000 UTC" deadline="2027-12-03 13:09:41.271318662 +0000 UTC" Apr 17 14:21:25.296145 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:25.296087 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14278h48m15.975235666s" Apr 17 14:21:25.398046 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:25.398008 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:25.398213 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:25.398162 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:25.398599 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:25.398576 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:25.398710 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:25.398676 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:26.900049 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:26.900008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:26.900490 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:26.900170 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:26.900490 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:26.900234 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs podName:d0eca2ae-83f6-462e-b7d9-9ab1592717a8 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:30.900212879 +0000 UTC m=+9.206966479 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs") pod "network-metrics-daemon-ccsgf" (UID: "d0eca2ae-83f6-462e-b7d9-9ab1592717a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:27.000435 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:27.000390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5wk\" (UniqueName: \"kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk\") pod \"network-check-target-9cwz2\" (UID: \"39622ed6-2176-4d02-9823-0818852bbb2d\") " pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:27.000599 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:27.000575 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:27.000599 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:27.000595 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:27.000714 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:27.000608 2575 projected.go:194] Error preparing data for projected volume kube-api-access-7q5wk for pod openshift-network-diagnostics/network-check-target-9cwz2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:27.000714 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:27.000669 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk podName:39622ed6-2176-4d02-9823-0818852bbb2d nodeName:}" failed. 
No retries permitted until 2026-04-17 14:21:31.000649123 +0000 UTC m=+9.307402725 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7q5wk" (UniqueName: "kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk") pod "network-check-target-9cwz2" (UID: "39622ed6-2176-4d02-9823-0818852bbb2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:27.397940 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:27.397726 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:27.397940 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:27.397907 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:27.398188 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:27.397966 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:27.398188 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:27.398041 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:29.398334 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:29.398285 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:29.398813 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:29.398341 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:29.398813 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:29.398415 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:29.398813 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:29.398525 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:30.934525 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:30.934474 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:30.935007 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:30.934646 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:30.935007 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:30.934717 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs podName:d0eca2ae-83f6-462e-b7d9-9ab1592717a8 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:38.934701097 +0000 UTC m=+17.241454705 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs") pod "network-metrics-daemon-ccsgf" (UID: "d0eca2ae-83f6-462e-b7d9-9ab1592717a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:31.035348 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:31.035304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5wk\" (UniqueName: \"kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk\") pod \"network-check-target-9cwz2\" (UID: \"39622ed6-2176-4d02-9823-0818852bbb2d\") " pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:31.035531 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:31.035505 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:31.035531 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:31.035520 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:31.035531 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:31.035529 2575 projected.go:194] Error preparing data for projected volume kube-api-access-7q5wk for pod openshift-network-diagnostics/network-check-target-9cwz2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:31.035701 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:31.035586 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk podName:39622ed6-2176-4d02-9823-0818852bbb2d nodeName:}" failed. 
No retries permitted until 2026-04-17 14:21:39.035567717 +0000 UTC m=+17.342321314 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7q5wk" (UniqueName: "kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk") pod "network-check-target-9cwz2" (UID: "39622ed6-2176-4d02-9823-0818852bbb2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:31.398557 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:31.398343 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:31.398557 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:31.398464 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:31.398557 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:31.398504 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:31.398832 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:31.398592 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:33.398485 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:33.398378 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:33.399013 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:33.398511 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:33.399013 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:33.398992 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:33.399129 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:33.399100 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:33.589363 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:33.589322 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-3.ec2.internal" event={"ID":"fcb38c6600a2cdeab28c1b9fce28f12c","Type":"ContainerStarted","Data":"6eba3e20236f3c35b7f0aea354cd8ef01e36ebb98f758f25134062c5e2ce9d2b"} Apr 17 14:21:33.591367 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:33.591338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6xth9" event={"ID":"0d1f7ee5-7856-41fe-9646-4344f387a26d","Type":"ContainerStarted","Data":"43e787c23c865a11148812b5670de589874c73010de88f20d60ff70877ebec13"} Apr 17 14:21:33.603395 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:33.603339 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-3.ec2.internal" podStartSLOduration=11.60332243 podStartE2EDuration="11.60332243s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:21:33.602933598 +0000 UTC m=+11.909687217" watchObservedRunningTime="2026-04-17 14:21:33.60332243 +0000 UTC m=+11.910076049" Apr 17 14:21:33.618567 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:33.618509 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6xth9" podStartSLOduration=2.144863845 podStartE2EDuration="11.618490959s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="2026-04-17 14:21:23.674641787 +0000 UTC m=+1.981395384" lastFinishedPulling="2026-04-17 14:21:33.148268887 +0000 UTC m=+11.455022498" observedRunningTime="2026-04-17 14:21:33.618121012 +0000 UTC m=+11.924874632" watchObservedRunningTime="2026-04-17 
14:21:33.618490959 +0000 UTC m=+11.925244578" Apr 17 14:21:34.594584 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:34.594328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" event={"ID":"2b09b5827d36ff6ab7618b3998f1064e","Type":"ContainerStarted","Data":"788eafd712310f235cc5964f6fc2847dee1da12eb8e571a2421c60bbb499d4e9"} Apr 17 14:21:34.595762 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:34.595728 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bx274" event={"ID":"f386792e-2dbd-4e36-af17-6dbd71a6ad31","Type":"ContainerStarted","Data":"baa3f0c87b3111fe7536662de7e1b3351f06c368a96ac039c10f238573b36e9d"} Apr 17 14:21:34.597093 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:34.597060 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" event={"ID":"63505374-1c69-4d7f-853d-90e9526b6d12","Type":"ContainerStarted","Data":"9e08a8515afcc7736e3508454d06f81af7b734ce70d7292cf8aa04c9815d4719"} Apr 17 14:21:34.598421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:34.598388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-58lbn" event={"ID":"822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f","Type":"ContainerStarted","Data":"1fde8fbbc64810d82b778e252b14f37fe389fa532d50cb78fdc85258f05b2783"} Apr 17 14:21:34.599658 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:34.599617 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kvtp7" event={"ID":"e8508015-adfb-42aa-acfc-92b24ec90241","Type":"ContainerStarted","Data":"935d529ec6b19dbe78ce0a0a834b4d477ace72d600b70c8a9ac27cd0de5587c2"} Apr 17 14:21:34.601171 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:34.601151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" 
event={"ID":"31c597ae-daa7-47cf-855c-9a1613df2d3f","Type":"ContainerStarted","Data":"3ef5508ac53ca182e95f270ca2dd11e768c0d365b0fbf17b29dbd7dba5c95de7"} Apr 17 14:21:34.602474 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:34.602420 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jrx2k" event={"ID":"3081116a-d0b8-4212-ab0e-d65967b55e93","Type":"ContainerStarted","Data":"13944b4ec5b65594c3b0ef2176ae71382b9d10ca8f20b2a82bd245d7d7758bf7"} Apr 17 14:21:34.621021 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:34.620938 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jrx2k" podStartSLOduration=3.083511327 podStartE2EDuration="12.620926778s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="2026-04-17 14:21:23.588134255 +0000 UTC m=+1.894887852" lastFinishedPulling="2026-04-17 14:21:33.125549692 +0000 UTC m=+11.432303303" observedRunningTime="2026-04-17 14:21:34.620622377 +0000 UTC m=+12.927375995" watchObservedRunningTime="2026-04-17 14:21:34.620926778 +0000 UTC m=+12.927680395" Apr 17 14:21:34.633905 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:34.633847 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-58lbn" podStartSLOduration=3.181347351 podStartE2EDuration="12.633835323s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="2026-04-17 14:21:23.645230196 +0000 UTC m=+1.951983794" lastFinishedPulling="2026-04-17 14:21:33.097718164 +0000 UTC m=+11.404471766" observedRunningTime="2026-04-17 14:21:34.633471418 +0000 UTC m=+12.940225039" watchObservedRunningTime="2026-04-17 14:21:34.633835323 +0000 UTC m=+12.940588943" Apr 17 14:21:34.646379 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:34.646343 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-bx274" 
podStartSLOduration=3.185629522 podStartE2EDuration="12.646331687s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="2026-04-17 14:21:23.663042841 +0000 UTC m=+1.969796439" lastFinishedPulling="2026-04-17 14:21:33.123745002 +0000 UTC m=+11.430498604" observedRunningTime="2026-04-17 14:21:34.645974008 +0000 UTC m=+12.952727627" watchObservedRunningTime="2026-04-17 14:21:34.646331687 +0000 UTC m=+12.953085306" Apr 17 14:21:34.662374 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:34.662334 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kvtp7" podStartSLOduration=3.195952696 podStartE2EDuration="12.66232247s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="2026-04-17 14:21:23.633845691 +0000 UTC m=+1.940599289" lastFinishedPulling="2026-04-17 14:21:33.100215466 +0000 UTC m=+11.406969063" observedRunningTime="2026-04-17 14:21:34.661892919 +0000 UTC m=+12.968646539" watchObservedRunningTime="2026-04-17 14:21:34.66232247 +0000 UTC m=+12.969076089" Apr 17 14:21:35.398215 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:35.398175 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:35.398432 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:35.398191 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:35.398432 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:35.398288 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:35.398544 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:35.398424 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:37.398130 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:37.398100 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:37.398909 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:37.398093 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:37.398909 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:37.398234 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:37.398909 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:37.398275 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:37.609344 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:37.609308 2575 generic.go:358] "Generic (PLEG): container finished" podID="2b09b5827d36ff6ab7618b3998f1064e" containerID="788eafd712310f235cc5964f6fc2847dee1da12eb8e571a2421c60bbb499d4e9" exitCode=0 Apr 17 14:21:37.609495 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:37.609379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" event={"ID":"2b09b5827d36ff6ab7618b3998f1064e","Type":"ContainerDied","Data":"788eafd712310f235cc5964f6fc2847dee1da12eb8e571a2421c60bbb499d4e9"} Apr 17 14:21:37.610917 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:37.610898 2575 generic.go:358] "Generic (PLEG): container finished" podID="63505374-1c69-4d7f-853d-90e9526b6d12" containerID="9e08a8515afcc7736e3508454d06f81af7b734ce70d7292cf8aa04c9815d4719" exitCode=0 Apr 17 14:21:37.611026 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:37.610928 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" event={"ID":"63505374-1c69-4d7f-853d-90e9526b6d12","Type":"ContainerDied","Data":"9e08a8515afcc7736e3508454d06f81af7b734ce70d7292cf8aa04c9815d4719"} Apr 17 14:21:38.255003 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:38.254970 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-bx274" Apr 17 14:21:38.255633 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:38.255613 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-bx274" Apr 17 14:21:38.612898 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:38.612849 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-bx274" Apr 17 14:21:38.613492 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:21:38.613471 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-bx274" Apr 17 14:21:38.997890 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:38.997806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:38.998076 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:38.997941 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:38.998076 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:38.998025 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs podName:d0eca2ae-83f6-462e-b7d9-9ab1592717a8 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:54.998006299 +0000 UTC m=+33.304759897 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs") pod "network-metrics-daemon-ccsgf" (UID: "d0eca2ae-83f6-462e-b7d9-9ab1592717a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:39.098262 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:39.098228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5wk\" (UniqueName: \"kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk\") pod \"network-check-target-9cwz2\" (UID: \"39622ed6-2176-4d02-9823-0818852bbb2d\") " pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:39.098439 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:39.098377 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:39.098439 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:39.098398 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:39.098439 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:39.098411 2575 projected.go:194] Error preparing data for projected volume kube-api-access-7q5wk for pod openshift-network-diagnostics/network-check-target-9cwz2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:39.098584 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:39.098474 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk podName:39622ed6-2176-4d02-9823-0818852bbb2d nodeName:}" failed. 
No retries permitted until 2026-04-17 14:21:55.098456349 +0000 UTC m=+33.405209946 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7q5wk" (UniqueName: "kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk") pod "network-check-target-9cwz2" (UID: "39622ed6-2176-4d02-9823-0818852bbb2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:39.398684 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:39.398641 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:39.398857 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:39.398773 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:39.398857 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:39.398832 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:39.398983 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:39.398953 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:41.398703 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:41.398669 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:41.399101 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:41.398669 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:41.399101 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:41.398781 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:41.399101 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:41.398844 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:42.102252 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.102038 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 14:21:42.341124 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.341030 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T14:21:42.102252738Z","UUID":"8947f5d6-ee3c-463f-a394-1b16a62abdc6","Handler":null,"Name":"","Endpoint":""} Apr 17 14:21:42.344577 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.344554 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 14:21:42.344718 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.344587 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 14:21:42.623155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.622991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" event={"ID":"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf","Type":"ContainerStarted","Data":"210adca9e9a054bebcc53c455759b00fbc6947fec1df398e4586857284b8856b"} Apr 17 14:21:42.623155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.623027 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" event={"ID":"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf","Type":"ContainerStarted","Data":"912e9eec4223dc4d299d2c8699da1484e049e20d7a10033257e51e3087d1183d"} Apr 17 14:21:42.623155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.623040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" event={"ID":"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf","Type":"ContainerStarted","Data":"577144dbd9fad8e2d527f28b5a7ca0a76fdc1f89ad493fdebc1a4a2e61cc963e"} Apr 17 14:21:42.623155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.623059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" event={"ID":"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf","Type":"ContainerStarted","Data":"1aa8142c02f977d8d7b0d4503537d5bba38d2041ad3ddc8bdab238825a7560f3"} Apr 17 14:21:42.623155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.623069 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" event={"ID":"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf","Type":"ContainerStarted","Data":"722ab8d748f47e4e93bbf854284b66f1d81c878877e6ae5a0cb8ca5e26d673b9"} Apr 17 14:21:42.623155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.623081 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" event={"ID":"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf","Type":"ContainerStarted","Data":"35d36cd9513fe5a452a62b3fc2dac1bf88c562f272704dbaf9a3b4c673356163"} Apr 17 14:21:42.626072 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.626041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" event={"ID":"2b09b5827d36ff6ab7618b3998f1064e","Type":"ContainerStarted","Data":"39020961a037d0345f88badc8df46621b28eec49dfe08972d588e35fc0ebe775"} Apr 17 14:21:42.627690 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.627665 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gxf4n" event={"ID":"207ded88-4793-4bf3-9d1a-a6775c96a280","Type":"ContainerStarted","Data":"d8b1f429e5705e33012b7cede79240d4762006f9691c31df50e6f8c87e2e8e4c"} Apr 17 14:21:42.630123 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.630091 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" event={"ID":"31c597ae-daa7-47cf-855c-9a1613df2d3f","Type":"ContainerStarted","Data":"436761f0e2daa4c2a8e2b461f602958ed5ae9a3de0c2fec7d2e1bb6dd1accd30"} Apr 17 14:21:42.641331 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.641227 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-3.ec2.internal" podStartSLOduration=20.64121316 podStartE2EDuration="20.64121316s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:21:42.640600921 +0000 UTC m=+20.947354540" watchObservedRunningTime="2026-04-17 14:21:42.64121316 +0000 UTC m=+20.947966782" Apr 17 14:21:42.660083 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:42.660027 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gxf4n" podStartSLOduration=2.436182449 podStartE2EDuration="20.6600073s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="2026-04-17 14:21:23.704597295 +0000 UTC m=+2.011350892" lastFinishedPulling="2026-04-17 14:21:41.928422134 +0000 UTC m=+20.235175743" observedRunningTime="2026-04-17 14:21:42.659155838 +0000 UTC m=+20.965909458" watchObservedRunningTime="2026-04-17 14:21:42.6600073 +0000 UTC m=+20.966760919" Apr 17 14:21:43.398119 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:43.398091 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:43.398300 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:43.398091 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:43.398300 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:43.398204 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:43.398392 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:43.398315 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:43.634128 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:43.634092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" event={"ID":"31c597ae-daa7-47cf-855c-9a1613df2d3f","Type":"ContainerStarted","Data":"fa24541901cc367b98e8583c3881dec23899564953ee8800f12f4a2a51abd2c9"} Apr 17 14:21:45.397936 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:45.397896 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:45.398343 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:45.397896 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:45.398343 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:45.398043 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:45.398343 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:45.398084 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:45.642857 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:45.642821 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" event={"ID":"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf","Type":"ContainerStarted","Data":"afd6b6564e8a7cf65c15c7df5107f6d079dbc831bbd83ec11d329d80e0525e5b"} Apr 17 14:21:45.644275 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:45.644245 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" event={"ID":"63505374-1c69-4d7f-853d-90e9526b6d12","Type":"ContainerStarted","Data":"9f97a9654d7250c46aa702c50e3f7ead10f63f57197d8c1221bcca3435fcdd1a"} Apr 17 14:21:45.676132 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:45.676046 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4dhg6" podStartSLOduration=4.320097887 
podStartE2EDuration="23.676031708s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="2026-04-17 14:21:23.618728843 +0000 UTC m=+1.925482440" lastFinishedPulling="2026-04-17 14:21:42.974662649 +0000 UTC m=+21.281416261" observedRunningTime="2026-04-17 14:21:43.651252691 +0000 UTC m=+21.958006310" watchObservedRunningTime="2026-04-17 14:21:45.676031708 +0000 UTC m=+23.982785377" Apr 17 14:21:45.792123 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:45.792084 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fhzz8"] Apr 17 14:21:45.794864 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:45.794842 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:45.795013 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:45.794940 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fhzz8" podUID="05b23a0f-2e19-4ad7-861d-8f2bb92bc69c" Apr 17 14:21:45.953622 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:45.953542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-kubelet-config\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:45.953760 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:45.953629 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-dbus\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:45.953760 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:45.953675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:46.054867 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:46.054830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-dbus\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:46.055055 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:46.054972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:46.055055 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:46.055008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-kubelet-config\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:46.055122 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:46.055078 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-dbus\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:46.055122 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:46.055092 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-kubelet-config\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:46.055122 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:46.055107 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:21:46.055211 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:46.055162 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret podName:05b23a0f-2e19-4ad7-861d-8f2bb92bc69c nodeName:}" failed. 
No retries permitted until 2026-04-17 14:21:46.555149313 +0000 UTC m=+24.861902915 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret") pod "global-pull-secret-syncer-fhzz8" (UID: "05b23a0f-2e19-4ad7-861d-8f2bb92bc69c") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:21:46.558022 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:46.557991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:46.558494 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:46.558100 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:21:46.558494 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:46.558145 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret podName:05b23a0f-2e19-4ad7-861d-8f2bb92bc69c nodeName:}" failed. No retries permitted until 2026-04-17 14:21:47.558133207 +0000 UTC m=+25.864886805 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret") pod "global-pull-secret-syncer-fhzz8" (UID: "05b23a0f-2e19-4ad7-861d-8f2bb92bc69c") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:21:46.647067 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:46.647030 2575 generic.go:358] "Generic (PLEG): container finished" podID="63505374-1c69-4d7f-853d-90e9526b6d12" containerID="9f97a9654d7250c46aa702c50e3f7ead10f63f57197d8c1221bcca3435fcdd1a" exitCode=0 Apr 17 14:21:46.647067 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:46.647062 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" event={"ID":"63505374-1c69-4d7f-853d-90e9526b6d12","Type":"ContainerDied","Data":"9f97a9654d7250c46aa702c50e3f7ead10f63f57197d8c1221bcca3435fcdd1a"} Apr 17 14:21:47.398730 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:47.398695 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:47.398903 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:47.398815 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:47.398974 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:47.398814 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fhzz8" podUID="05b23a0f-2e19-4ad7-861d-8f2bb92bc69c" Apr 17 14:21:47.399030 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:47.398974 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:47.399030 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:47.398999 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:47.399093 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:47.399078 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:47.566180 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:47.566023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:47.566467 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:47.566161 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:21:47.566467 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:47.566256 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret podName:05b23a0f-2e19-4ad7-861d-8f2bb92bc69c nodeName:}" failed. No retries permitted until 2026-04-17 14:21:49.56621896 +0000 UTC m=+27.872972557 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret") pod "global-pull-secret-syncer-fhzz8" (UID: "05b23a0f-2e19-4ad7-861d-8f2bb92bc69c") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:21:47.653084 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:47.652908 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" event={"ID":"5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf","Type":"ContainerStarted","Data":"c515ca7dcf22fbf8a1aae17b38488f0d3557a46f4b1fffea164367a23b2fbe1d"} Apr 17 14:21:47.653460 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:47.653262 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:47.655345 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:47.655298 2575 generic.go:358] "Generic (PLEG): container finished" podID="63505374-1c69-4d7f-853d-90e9526b6d12" containerID="75e0a8e244383432ae44f5f98a63293d8026f29363cc2969ac16bd0ed8a8f37e" exitCode=0 Apr 17 14:21:47.655398 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:47.655342 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" event={"ID":"63505374-1c69-4d7f-853d-90e9526b6d12","Type":"ContainerDied","Data":"75e0a8e244383432ae44f5f98a63293d8026f29363cc2969ac16bd0ed8a8f37e"} Apr 17 14:21:47.670846 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:47.670700 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:47.681684 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:47.681636 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" podStartSLOduration=7.489283315 podStartE2EDuration="25.681618801s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" 
firstStartedPulling="2026-04-17 14:21:23.60758324 +0000 UTC m=+1.914336841" lastFinishedPulling="2026-04-17 14:21:41.799918717 +0000 UTC m=+20.106672327" observedRunningTime="2026-04-17 14:21:47.680783584 +0000 UTC m=+25.987537204" watchObservedRunningTime="2026-04-17 14:21:47.681618801 +0000 UTC m=+25.988372421" Apr 17 14:21:48.659148 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:48.659113 2575 generic.go:358] "Generic (PLEG): container finished" podID="63505374-1c69-4d7f-853d-90e9526b6d12" containerID="04b7201dc6cf4b724d4693ddb0feeb7c8ea8f69e9d94d4c5c82c905c410b74a6" exitCode=0 Apr 17 14:21:48.659534 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:48.659185 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" event={"ID":"63505374-1c69-4d7f-853d-90e9526b6d12","Type":"ContainerDied","Data":"04b7201dc6cf4b724d4693ddb0feeb7c8ea8f69e9d94d4c5c82c905c410b74a6"} Apr 17 14:21:48.659724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:48.659689 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:48.659724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:48.659723 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:48.674476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:48.674449 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:21:49.398271 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:49.398241 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:49.398465 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:49.398242 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:49.398465 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:49.398353 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:49.398550 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:49.398462 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:49.398550 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:49.398506 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:49.398617 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:49.398574 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fhzz8" podUID="05b23a0f-2e19-4ad7-861d-8f2bb92bc69c" Apr 17 14:21:49.582537 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:49.582492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:49.582707 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:49.582663 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:21:49.582761 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:49.582742 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret podName:05b23a0f-2e19-4ad7-861d-8f2bb92bc69c nodeName:}" failed. No retries permitted until 2026-04-17 14:21:53.582724657 +0000 UTC m=+31.889478259 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret") pod "global-pull-secret-syncer-fhzz8" (UID: "05b23a0f-2e19-4ad7-861d-8f2bb92bc69c") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:21:50.034745 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:50.034449 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9cwz2"] Apr 17 14:21:50.035196 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:50.034857 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:50.035196 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:50.034981 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:50.038373 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:50.038347 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ccsgf"] Apr 17 14:21:50.038502 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:50.038386 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fhzz8"] Apr 17 14:21:50.038502 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:50.038458 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:50.038591 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:50.038552 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fhzz8" podUID="05b23a0f-2e19-4ad7-861d-8f2bb92bc69c" Apr 17 14:21:50.038640 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:50.038621 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:50.038754 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:50.038704 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:51.397724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:51.397689 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:51.398212 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:51.397689 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:51.398212 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:51.397689 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:51.398212 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:51.397813 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fhzz8" podUID="05b23a0f-2e19-4ad7-861d-8f2bb92bc69c" Apr 17 14:21:51.398212 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:51.397955 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:51.398212 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:51.398042 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:53.397704 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:53.397663 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:53.397704 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:53.397693 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:53.398409 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:53.397701 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:53.398409 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:53.397801 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:21:53.398409 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:53.397901 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9cwz2" podUID="39622ed6-2176-4d02-9823-0818852bbb2d" Apr 17 14:21:53.398409 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:53.397990 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fhzz8" podUID="05b23a0f-2e19-4ad7-861d-8f2bb92bc69c" Apr 17 14:21:53.612807 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:53.612772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:53.613006 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:53.612938 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:21:53.613006 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:53.612996 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret podName:05b23a0f-2e19-4ad7-861d-8f2bb92bc69c nodeName:}" failed. No retries permitted until 2026-04-17 14:22:01.61298272 +0000 UTC m=+39.919736317 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret") pod "global-pull-secret-syncer-fhzz8" (UID: "05b23a0f-2e19-4ad7-861d-8f2bb92bc69c") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:21:54.467487 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.467456 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-3.ec2.internal" event="NodeReady" Apr 17 14:21:54.467894 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.467601 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 14:21:54.507275 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.507240 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lp8nz"] Apr 17 14:21:54.525549 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.525515 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-869tz"] Apr 17 14:21:54.525735 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.525715 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.528460 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.528417 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 14:21:54.528584 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.528519 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 14:21:54.528670 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.528657 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rkhth\"" Apr 17 14:21:54.537323 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.537302 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lp8nz"] Apr 17 14:21:54.537323 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.537326 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-869tz"] Apr 17 14:21:54.537484 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.537440 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:21:54.539845 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.539826 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 14:21:54.539845 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.539837 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w9wr2\"" Apr 17 14:21:54.540001 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.539847 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 14:21:54.540001 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.539825 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 14:21:54.618351 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.618264 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z4tw\" (UniqueName: \"kubernetes.io/projected/d37b1dac-43fd-47dd-9f14-18b1f81b8155-kube-api-access-6z4tw\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:21:54.618351 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.618324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822a8df9-29ea-4649-a163-22e1db926c84-config-volume\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.618351 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.618341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8448c\" (UniqueName: 
\"kubernetes.io/projected/822a8df9-29ea-4649-a163-22e1db926c84-kube-api-access-8448c\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.618562 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.618366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:21:54.618562 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.618388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/822a8df9-29ea-4649-a163-22e1db926c84-tmp-dir\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.618562 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.618471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.719348 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.719309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822a8df9-29ea-4649-a163-22e1db926c84-config-volume\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.719544 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.719359 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8448c\" (UniqueName: 
\"kubernetes.io/projected/822a8df9-29ea-4649-a163-22e1db926c84-kube-api-access-8448c\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.719544 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.719404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:21:54.719544 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.719442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/822a8df9-29ea-4649-a163-22e1db926c84-tmp-dir\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.719544 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.719485 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.719544 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.719527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z4tw\" (UniqueName: \"kubernetes.io/projected/d37b1dac-43fd-47dd-9f14-18b1f81b8155-kube-api-access-6z4tw\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:21:54.719769 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:54.719639 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:21:54.719769 ip-10-0-138-3 
kubenswrapper[2575]: E0417 14:21:54.719710 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls podName:822a8df9-29ea-4649-a163-22e1db926c84 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:55.21968785 +0000 UTC m=+33.526441464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls") pod "dns-default-lp8nz" (UID: "822a8df9-29ea-4649-a163-22e1db926c84") : secret "dns-default-metrics-tls" not found Apr 17 14:21:54.719769 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:54.719762 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:21:54.719949 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:54.719820 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert podName:d37b1dac-43fd-47dd-9f14-18b1f81b8155 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:55.219800878 +0000 UTC m=+33.526554494 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert") pod "ingress-canary-869tz" (UID: "d37b1dac-43fd-47dd-9f14-18b1f81b8155") : secret "canary-serving-cert" not found Apr 17 14:21:54.719949 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.719897 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/822a8df9-29ea-4649-a163-22e1db926c84-tmp-dir\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.720013 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.719970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822a8df9-29ea-4649-a163-22e1db926c84-config-volume\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.731001 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.730976 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8448c\" (UniqueName: \"kubernetes.io/projected/822a8df9-29ea-4649-a163-22e1db926c84-kube-api-access-8448c\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:54.731118 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:54.731092 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z4tw\" (UniqueName: \"kubernetes.io/projected/d37b1dac-43fd-47dd-9f14-18b1f81b8155-kube-api-access-6z4tw\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:21:55.021373 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.021324 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:55.021545 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:55.021468 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:55.021545 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:55.021535 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs podName:d0eca2ae-83f6-462e-b7d9-9ab1592717a8 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:27.021520606 +0000 UTC m=+65.328274205 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs") pod "network-metrics-daemon-ccsgf" (UID: "d0eca2ae-83f6-462e-b7d9-9ab1592717a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:55.122141 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.122102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5wk\" (UniqueName: \"kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk\") pod \"network-check-target-9cwz2\" (UID: \"39622ed6-2176-4d02-9823-0818852bbb2d\") " pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:55.122288 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:55.122233 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:55.122288 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:55.122245 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:55.122288 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:55.122255 2575 projected.go:194] Error preparing data for projected volume kube-api-access-7q5wk for pod openshift-network-diagnostics/network-check-target-9cwz2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:55.122390 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:55.122297 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk podName:39622ed6-2176-4d02-9823-0818852bbb2d nodeName:}" failed. No retries permitted until 2026-04-17 14:22:27.122285424 +0000 UTC m=+65.429039022 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7q5wk" (UniqueName: "kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk") pod "network-check-target-9cwz2" (UID: "39622ed6-2176-4d02-9823-0818852bbb2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:55.222511 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.222412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:55.222660 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.222524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " 
pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:21:55.222660 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:55.222563 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:21:55.222660 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:55.222625 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:21:55.222660 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:55.222631 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls podName:822a8df9-29ea-4649-a163-22e1db926c84 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:56.222611284 +0000 UTC m=+34.529364892 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls") pod "dns-default-lp8nz" (UID: "822a8df9-29ea-4649-a163-22e1db926c84") : secret "dns-default-metrics-tls" not found Apr 17 14:21:55.222792 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:55.222665 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert podName:d37b1dac-43fd-47dd-9f14-18b1f81b8155 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:56.222655888 +0000 UTC m=+34.529409493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert") pod "ingress-canary-869tz" (UID: "d37b1dac-43fd-47dd-9f14-18b1f81b8155") : secret "canary-serving-cert" not found Apr 17 14:21:55.397830 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.397797 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:21:55.398065 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.397946 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:21:55.398163 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.398143 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:21:55.400781 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.400759 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 14:21:55.401980 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.401949 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:21:55.401980 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.401965 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 14:21:55.402136 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.401984 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qbxgh\"" Apr 17 14:21:55.402136 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.401968 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hxvwz\"" Apr 17 14:21:55.402136 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.401966 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 14:21:55.677716 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.677685 2575 generic.go:358] "Generic (PLEG): container finished" podID="63505374-1c69-4d7f-853d-90e9526b6d12" 
containerID="38e2f7d304a0e769fb69a4f1861aee31727e549ac7813a76b8a2d4de21a7d815" exitCode=0 Apr 17 14:21:55.678335 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:55.677746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" event={"ID":"63505374-1c69-4d7f-853d-90e9526b6d12","Type":"ContainerDied","Data":"38e2f7d304a0e769fb69a4f1861aee31727e549ac7813a76b8a2d4de21a7d815"} Apr 17 14:21:56.229155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:56.229117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:21:56.229369 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:56.229177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:56.229369 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:56.229266 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:21:56.229369 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:56.229282 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:21:56.229369 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:56.229337 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert podName:d37b1dac-43fd-47dd-9f14-18b1f81b8155 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:58.229317517 +0000 UTC m=+36.536071119 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert") pod "ingress-canary-869tz" (UID: "d37b1dac-43fd-47dd-9f14-18b1f81b8155") : secret "canary-serving-cert" not found Apr 17 14:21:56.229369 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:56.229352 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls podName:822a8df9-29ea-4649-a163-22e1db926c84 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:58.229345877 +0000 UTC m=+36.536099474 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls") pod "dns-default-lp8nz" (UID: "822a8df9-29ea-4649-a163-22e1db926c84") : secret "dns-default-metrics-tls" not found Apr 17 14:21:56.682021 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:56.681973 2575 generic.go:358] "Generic (PLEG): container finished" podID="63505374-1c69-4d7f-853d-90e9526b6d12" containerID="c43d478086cd03802d071f5fbc738248e98a77c41c0201018accf19fceecbda8" exitCode=0 Apr 17 14:21:56.682384 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:56.682028 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" event={"ID":"63505374-1c69-4d7f-853d-90e9526b6d12","Type":"ContainerDied","Data":"c43d478086cd03802d071f5fbc738248e98a77c41c0201018accf19fceecbda8"} Apr 17 14:21:57.686703 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:57.686521 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" event={"ID":"63505374-1c69-4d7f-853d-90e9526b6d12","Type":"ContainerStarted","Data":"67ccd22af3e0640391e5fb00a87d32194967cbad6a2bce754dc8b06f753140fe"} Apr 17 14:21:57.708429 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:57.708373 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-additional-cni-plugins-vp9jm" podStartSLOduration=4.760336579 podStartE2EDuration="35.708355522s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="2026-04-17 14:21:23.698688952 +0000 UTC m=+2.005442549" lastFinishedPulling="2026-04-17 14:21:54.646707888 +0000 UTC m=+32.953461492" observedRunningTime="2026-04-17 14:21:57.707025227 +0000 UTC m=+36.013778868" watchObservedRunningTime="2026-04-17 14:21:57.708355522 +0000 UTC m=+36.015109165" Apr 17 14:21:58.244821 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:58.244782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:21:58.245039 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:21:58.244861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:21:58.245039 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:58.244925 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:21:58.245039 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:58.244990 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:21:58.245039 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:58.245002 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls podName:822a8df9-29ea-4649-a163-22e1db926c84 nodeName:}" failed. 
No retries permitted until 2026-04-17 14:22:02.244982373 +0000 UTC m=+40.551735982 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls") pod "dns-default-lp8nz" (UID: "822a8df9-29ea-4649-a163-22e1db926c84") : secret "dns-default-metrics-tls" not found Apr 17 14:21:58.245039 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:21:58.245034 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert podName:d37b1dac-43fd-47dd-9f14-18b1f81b8155 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:02.245021639 +0000 UTC m=+40.551775237 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert") pod "ingress-canary-869tz" (UID: "d37b1dac-43fd-47dd-9f14-18b1f81b8155") : secret "canary-serving-cert" not found Apr 17 14:22:01.668094 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:01.668050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:22:01.671118 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:01.671089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05b23a0f-2e19-4ad7-861d-8f2bb92bc69c-original-pull-secret\") pod \"global-pull-secret-syncer-fhzz8\" (UID: \"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c\") " pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:22:01.717541 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:01.717509 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhzz8" Apr 17 14:22:01.871107 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:01.871054 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fhzz8"] Apr 17 14:22:01.875995 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:22:01.875969 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05b23a0f_2e19_4ad7_861d_8f2bb92bc69c.slice/crio-33922f1ae6d096fd5b81ba3f1adade546bf966cb0309e5b08d2f3363315c2376 WatchSource:0}: Error finding container 33922f1ae6d096fd5b81ba3f1adade546bf966cb0309e5b08d2f3363315c2376: Status 404 returned error can't find the container with id 33922f1ae6d096fd5b81ba3f1adade546bf966cb0309e5b08d2f3363315c2376 Apr 17 14:22:02.272109 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:02.272067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:22:02.272289 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:02.272125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:22:02.272289 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:02.272206 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:22:02.272289 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:02.272206 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 
14:22:02.272289 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:02.272257 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls podName:822a8df9-29ea-4649-a163-22e1db926c84 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:10.27224233 +0000 UTC m=+48.578995932 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls") pod "dns-default-lp8nz" (UID: "822a8df9-29ea-4649-a163-22e1db926c84") : secret "dns-default-metrics-tls" not found Apr 17 14:22:02.272289 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:02.272269 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert podName:d37b1dac-43fd-47dd-9f14-18b1f81b8155 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:10.272263605 +0000 UTC m=+48.579017202 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert") pod "ingress-canary-869tz" (UID: "d37b1dac-43fd-47dd-9f14-18b1f81b8155") : secret "canary-serving-cert" not found Apr 17 14:22:02.703363 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:02.703160 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fhzz8" event={"ID":"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c","Type":"ContainerStarted","Data":"33922f1ae6d096fd5b81ba3f1adade546bf966cb0309e5b08d2f3363315c2376"} Apr 17 14:22:06.712012 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:06.711955 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fhzz8" event={"ID":"05b23a0f-2e19-4ad7-861d-8f2bb92bc69c","Type":"ContainerStarted","Data":"cfb98bd2d597b036254dae4d93926d9ff22708efab2a63eea2f843fbc37c0140"} Apr 17 14:22:06.726191 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:06.726143 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fhzz8" podStartSLOduration=17.564257395 podStartE2EDuration="21.726128559s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:22:01.877773616 +0000 UTC m=+40.184527228" lastFinishedPulling="2026-04-17 14:22:06.039644791 +0000 UTC m=+44.346398392" observedRunningTime="2026-04-17 14:22:06.725863863 +0000 UTC m=+45.032617484" watchObservedRunningTime="2026-04-17 14:22:06.726128559 +0000 UTC m=+45.032882168" Apr 17 14:22:10.333216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:10.333168 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:22:10.333216 ip-10-0-138-3 kubenswrapper[2575]: 
I0417 14:22:10.333225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:22:10.333793 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:10.333321 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:22:10.333793 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:10.333323 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:22:10.333793 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:10.333380 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls podName:822a8df9-29ea-4649-a163-22e1db926c84 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:26.333367253 +0000 UTC m=+64.640120851 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls") pod "dns-default-lp8nz" (UID: "822a8df9-29ea-4649-a163-22e1db926c84") : secret "dns-default-metrics-tls" not found Apr 17 14:22:10.333793 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:10.333394 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert podName:d37b1dac-43fd-47dd-9f14-18b1f81b8155 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:26.333388771 +0000 UTC m=+64.640142369 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert") pod "ingress-canary-869tz" (UID: "d37b1dac-43fd-47dd-9f14-18b1f81b8155") : secret "canary-serving-cert" not found Apr 17 14:22:20.675494 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:20.675468 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x4dft" Apr 17 14:22:26.350141 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:26.350097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:22:26.350537 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:26.350170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:22:26.350537 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:26.350234 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:22:26.350537 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:26.350272 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:22:26.350537 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:26.350308 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls podName:822a8df9-29ea-4649-a163-22e1db926c84 nodeName:}" failed. 
No retries permitted until 2026-04-17 14:22:58.350291558 +0000 UTC m=+96.657045160 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls") pod "dns-default-lp8nz" (UID: "822a8df9-29ea-4649-a163-22e1db926c84") : secret "dns-default-metrics-tls" not found Apr 17 14:22:26.350537 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:26.350321 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert podName:d37b1dac-43fd-47dd-9f14-18b1f81b8155 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:58.350315711 +0000 UTC m=+96.657069308 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert") pod "ingress-canary-869tz" (UID: "d37b1dac-43fd-47dd-9f14-18b1f81b8155") : secret "canary-serving-cert" not found Apr 17 14:22:27.053694 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:27.053661 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf" Apr 17 14:22:27.056622 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:27.056603 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:22:27.064811 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:27.064795 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 14:22:27.064923 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:27.064863 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs podName:d0eca2ae-83f6-462e-b7d9-9ab1592717a8 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:31.064842691 +0000 UTC m=+129.371596289 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs") pod "network-metrics-daemon-ccsgf" (UID: "d0eca2ae-83f6-462e-b7d9-9ab1592717a8") : secret "metrics-daemon-secret" not found Apr 17 14:22:27.154210 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:27.154163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5wk\" (UniqueName: \"kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk\") pod \"network-check-target-9cwz2\" (UID: \"39622ed6-2176-4d02-9823-0818852bbb2d\") " pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:22:27.156847 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:27.156831 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 14:22:27.167263 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:27.167245 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 14:22:27.179086 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:27.179060 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q5wk\" (UniqueName: \"kubernetes.io/projected/39622ed6-2176-4d02-9823-0818852bbb2d-kube-api-access-7q5wk\") pod \"network-check-target-9cwz2\" (UID: \"39622ed6-2176-4d02-9823-0818852bbb2d\") " pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:22:27.210138 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:27.210111 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qbxgh\"" Apr 17 14:22:27.218222 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:27.218203 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:22:27.328270 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:27.328240 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9cwz2"] Apr 17 14:22:27.332482 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:22:27.332453 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39622ed6_2176_4d02_9823_0818852bbb2d.slice/crio-de7da30c599c1702c3fb4746d3b5ac50219b4ac84fcd27c2628384dea6b1dfa9 WatchSource:0}: Error finding container de7da30c599c1702c3fb4746d3b5ac50219b4ac84fcd27c2628384dea6b1dfa9: Status 404 returned error can't find the container with id de7da30c599c1702c3fb4746d3b5ac50219b4ac84fcd27c2628384dea6b1dfa9 Apr 17 14:22:27.752836 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:27.752750 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9cwz2" event={"ID":"39622ed6-2176-4d02-9823-0818852bbb2d","Type":"ContainerStarted","Data":"de7da30c599c1702c3fb4746d3b5ac50219b4ac84fcd27c2628384dea6b1dfa9"} Apr 17 14:22:30.759244 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:30.759208 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9cwz2" event={"ID":"39622ed6-2176-4d02-9823-0818852bbb2d","Type":"ContainerStarted","Data":"9aa930b62bdcb9b2ef2578e8809e480a17da2bf460bf9701b551a6d58b42637b"} Apr 17 14:22:30.759674 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:30.759327 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:22:30.776143 
ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:30.776099 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9cwz2" podStartSLOduration=66.111895585 podStartE2EDuration="1m8.776085484s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="2026-04-17 14:22:27.334374447 +0000 UTC m=+65.641128044" lastFinishedPulling="2026-04-17 14:22:29.998564343 +0000 UTC m=+68.305317943" observedRunningTime="2026-04-17 14:22:30.775311193 +0000 UTC m=+69.082064851" watchObservedRunningTime="2026-04-17 14:22:30.776085484 +0000 UTC m=+69.082839126" Apr 17 14:22:58.394130 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:58.393983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:22:58.394130 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:22:58.394058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:22:58.394646 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:58.394132 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:22:58.394646 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:58.394145 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:22:58.394646 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:58.394210 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert podName:d37b1dac-43fd-47dd-9f14-18b1f81b8155 nodeName:}" failed. No retries permitted until 2026-04-17 14:24:02.394189982 +0000 UTC m=+160.700943580 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert") pod "ingress-canary-869tz" (UID: "d37b1dac-43fd-47dd-9f14-18b1f81b8155") : secret "canary-serving-cert" not found Apr 17 14:22:58.394646 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:22:58.394236 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls podName:822a8df9-29ea-4649-a163-22e1db926c84 nodeName:}" failed. No retries permitted until 2026-04-17 14:24:02.394222249 +0000 UTC m=+160.700975852 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls") pod "dns-default-lp8nz" (UID: "822a8df9-29ea-4649-a163-22e1db926c84") : secret "dns-default-metrics-tls" not found Apr 17 14:23:01.763970 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:01.763939 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9cwz2" Apr 17 14:23:17.173011 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.172980 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-685d56bbbb-mp6gq"] Apr 17 14:23:17.175652 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.175633 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.178295 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.178269 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 14:23:17.178438 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.178269 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-b9wcm\"" Apr 17 14:23:17.178533 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.178521 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 14:23:17.178598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.178539 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 14:23:17.178598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.178521 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 14:23:17.178598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.178567 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 14:23:17.178598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.178519 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 14:23:17.186653 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.186630 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-685d56bbbb-mp6gq"] Apr 17 14:23:17.318597 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.318564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-default-certificate\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.318597 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.318599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4qwr\" (UniqueName: \"kubernetes.io/projected/c067d95c-e09f-4c47-8281-f638f4740633-kube-api-access-q4qwr\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.318819 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.318633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.318819 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.318694 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.318819 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.318724 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-stats-auth\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 
14:23:17.381317 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.381285 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv"] Apr 17 14:23:17.384003 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.383985 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b4t8r"] Apr 17 14:23:17.384144 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.384126 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" Apr 17 14:23:17.386915 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.386887 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-b4t8r" Apr 17 14:23:17.387375 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.387355 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 14:23:17.388691 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.388669 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 14:23:17.388825 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.388690 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 14:23:17.388825 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.388690 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-dnz54\"" Apr 17 14:23:17.388825 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.388667 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 14:23:17.389251 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.389238 
2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 14:23:17.389810 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.389793 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 14:23:17.389948 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.389797 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 14:23:17.390293 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.390278 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-v4ssd\"" Apr 17 14:23:17.390511 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.390496 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 14:23:17.397152 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.397106 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b4t8r"] Apr 17 14:23:17.397365 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.397339 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 14:23:17.400020 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.399996 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv"] Apr 17 14:23:17.419214 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.419179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-default-certificate\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " 
pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.419400 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.419219 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4qwr\" (UniqueName: \"kubernetes.io/projected/c067d95c-e09f-4c47-8281-f638f4740633-kube-api-access-q4qwr\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.419400 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.419252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.419400 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.419305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.419400 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.419346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-stats-auth\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.419612 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:17.419418 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 14:23:17.419612 ip-10-0-138-3 
kubenswrapper[2575]: E0417 14:23:17.419486 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs podName:c067d95c-e09f-4c47-8281-f638f4740633 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:17.919466969 +0000 UTC m=+116.226220571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs") pod "router-default-685d56bbbb-mp6gq" (UID: "c067d95c-e09f-4c47-8281-f638f4740633") : secret "router-metrics-certs-default" not found Apr 17 14:23:17.419612 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:17.419514 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle podName:c067d95c-e09f-4c47-8281-f638f4740633 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:17.919498432 +0000 UTC m=+116.226252030 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle") pod "router-default-685d56bbbb-mp6gq" (UID: "c067d95c-e09f-4c47-8281-f638f4740633") : configmap references non-existent config key: service-ca.crt Apr 17 14:23:17.421860 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.421828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-default-certificate\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.422006 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.421866 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-stats-auth\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.427767 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.427714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4qwr\" (UniqueName: \"kubernetes.io/projected/c067d95c-e09f-4c47-8281-f638f4740633-kube-api-access-q4qwr\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:17.480518 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.480482 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq"] Apr 17 14:23:17.483587 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.483570 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq" Apr 17 14:23:17.486133 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.486106 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 14:23:17.486250 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.486182 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 14:23:17.486250 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.486217 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 14:23:17.486355 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.486249 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:23:17.486548 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.486534 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-thpjv\"" Apr 17 14:23:17.490167 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.490142 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq"] Apr 17 14:23:17.519985 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.519945 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5d9d12d-4816-4a8c-954c-b83681df2cd9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r" Apr 17 14:23:17.520159 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.520110 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5d9d12d-4816-4a8c-954c-b83681df2cd9-tmp\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r" Apr 17 14:23:17.520234 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.520213 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d9d12d-4816-4a8c-954c-b83681df2cd9-serving-cert\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r" Apr 17 14:23:17.520323 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.520296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" Apr 17 14:23:17.520368 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.520325 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c5d9d12d-4816-4a8c-954c-b83681df2cd9-snapshots\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r" Apr 17 14:23:17.520368 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.520354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7951b390-3fe0-4bc3-bd2a-fde607c15638-telemetry-config\") pod 
\"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" Apr 17 14:23:17.520441 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.520377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwxh9\" (UniqueName: \"kubernetes.io/projected/c5d9d12d-4816-4a8c-954c-b83681df2cd9-kube-api-access-fwxh9\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r" Apr 17 14:23:17.520441 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.520402 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lppg2\" (UniqueName: \"kubernetes.io/projected/7951b390-3fe0-4bc3-bd2a-fde607c15638-kube-api-access-lppg2\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" Apr 17 14:23:17.520519 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.520449 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5d9d12d-4816-4a8c-954c-b83681df2cd9-service-ca-bundle\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r" Apr 17 14:23:17.621350 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7951b390-3fe0-4bc3-bd2a-fde607c15638-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" Apr 17 14:23:17.621517 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwxh9\" (UniqueName: \"kubernetes.io/projected/c5d9d12d-4816-4a8c-954c-b83681df2cd9-kube-api-access-fwxh9\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r" Apr 17 14:23:17.621517 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621403 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3196128-fba4-41e4-a197-d8c5cb0025cb-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rkftq\" (UID: \"b3196128-fba4-41e4-a197-d8c5cb0025cb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq" Apr 17 14:23:17.621517 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621427 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhc7f\" (UniqueName: \"kubernetes.io/projected/b3196128-fba4-41e4-a197-d8c5cb0025cb-kube-api-access-zhc7f\") pod \"service-ca-operator-d6fc45fc5-rkftq\" (UID: \"b3196128-fba4-41e4-a197-d8c5cb0025cb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq" Apr 17 14:23:17.621517 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621474 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lppg2\" (UniqueName: \"kubernetes.io/projected/7951b390-3fe0-4bc3-bd2a-fde607c15638-kube-api-access-lppg2\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" Apr 17 14:23:17.621517 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621499 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3196128-fba4-41e4-a197-d8c5cb0025cb-config\") pod \"service-ca-operator-d6fc45fc5-rkftq\" (UID: \"b3196128-fba4-41e4-a197-d8c5cb0025cb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq" Apr 17 14:23:17.621734 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5d9d12d-4816-4a8c-954c-b83681df2cd9-service-ca-bundle\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r" Apr 17 14:23:17.621734 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5d9d12d-4816-4a8c-954c-b83681df2cd9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r" Apr 17 14:23:17.621734 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5d9d12d-4816-4a8c-954c-b83681df2cd9-tmp\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r" Apr 17 14:23:17.621734 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d9d12d-4816-4a8c-954c-b83681df2cd9-serving-cert\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " 
pod="openshift-insights/insights-operator-585dfdc468-b4t8r"
Apr 17 14:23:17.621931 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv"
Apr 17 14:23:17.621931 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.621799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c5d9d12d-4816-4a8c-954c-b83681df2cd9-snapshots\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r"
Apr 17 14:23:17.622036 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:17.621939 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 14:23:17.622088 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:17.622033 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls podName:7951b390-3fe0-4bc3-bd2a-fde607c15638 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:18.122011869 +0000 UTC m=+116.428765482 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zrlkv" (UID: "7951b390-3fe0-4bc3-bd2a-fde607c15638") : secret "cluster-monitoring-operator-tls" not found
Apr 17 14:23:17.622148 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.622132 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5d9d12d-4816-4a8c-954c-b83681df2cd9-tmp\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r"
Apr 17 14:23:17.622238 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.622204 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5d9d12d-4816-4a8c-954c-b83681df2cd9-service-ca-bundle\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r"
Apr 17 14:23:17.622344 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.622255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7951b390-3fe0-4bc3-bd2a-fde607c15638-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv"
Apr 17 14:23:17.622437 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.622415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c5d9d12d-4816-4a8c-954c-b83681df2cd9-snapshots\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r"
Apr 17 14:23:17.622479 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.622449 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5d9d12d-4816-4a8c-954c-b83681df2cd9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r"
Apr 17 14:23:17.624312 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.624295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d9d12d-4816-4a8c-954c-b83681df2cd9-serving-cert\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r"
Apr 17 14:23:17.630357 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.630328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwxh9\" (UniqueName: \"kubernetes.io/projected/c5d9d12d-4816-4a8c-954c-b83681df2cd9-kube-api-access-fwxh9\") pod \"insights-operator-585dfdc468-b4t8r\" (UID: \"c5d9d12d-4816-4a8c-954c-b83681df2cd9\") " pod="openshift-insights/insights-operator-585dfdc468-b4t8r"
Apr 17 14:23:17.630473 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.630390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lppg2\" (UniqueName: \"kubernetes.io/projected/7951b390-3fe0-4bc3-bd2a-fde607c15638-kube-api-access-lppg2\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv"
Apr 17 14:23:17.704847 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.704740 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-b4t8r"
Apr 17 14:23:17.722789 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.722756 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3196128-fba4-41e4-a197-d8c5cb0025cb-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rkftq\" (UID: \"b3196128-fba4-41e4-a197-d8c5cb0025cb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq"
Apr 17 14:23:17.722962 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.722799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhc7f\" (UniqueName: \"kubernetes.io/projected/b3196128-fba4-41e4-a197-d8c5cb0025cb-kube-api-access-zhc7f\") pod \"service-ca-operator-d6fc45fc5-rkftq\" (UID: \"b3196128-fba4-41e4-a197-d8c5cb0025cb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq"
Apr 17 14:23:17.722962 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.722921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3196128-fba4-41e4-a197-d8c5cb0025cb-config\") pod \"service-ca-operator-d6fc45fc5-rkftq\" (UID: \"b3196128-fba4-41e4-a197-d8c5cb0025cb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq"
Apr 17 14:23:17.723470 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.723450 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3196128-fba4-41e4-a197-d8c5cb0025cb-config\") pod \"service-ca-operator-d6fc45fc5-rkftq\" (UID: \"b3196128-fba4-41e4-a197-d8c5cb0025cb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq"
Apr 17 14:23:17.725146 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.725118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3196128-fba4-41e4-a197-d8c5cb0025cb-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rkftq\" (UID: \"b3196128-fba4-41e4-a197-d8c5cb0025cb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq"
Apr 17 14:23:17.731384 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.731364 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhc7f\" (UniqueName: \"kubernetes.io/projected/b3196128-fba4-41e4-a197-d8c5cb0025cb-kube-api-access-zhc7f\") pod \"service-ca-operator-d6fc45fc5-rkftq\" (UID: \"b3196128-fba4-41e4-a197-d8c5cb0025cb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq"
Apr 17 14:23:17.793331 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.793294 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq"
Apr 17 14:23:17.821773 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.821741 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b4t8r"]
Apr 17 14:23:17.825723 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:23:17.825695 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5d9d12d_4816_4a8c_954c_b83681df2cd9.slice/crio-57ccaad19603fac209a782ab427339aaae500deb0721602fdae0b487f808a53a WatchSource:0}: Error finding container 57ccaad19603fac209a782ab427339aaae500deb0721602fdae0b487f808a53a: Status 404 returned error can't find the container with id 57ccaad19603fac209a782ab427339aaae500deb0721602fdae0b487f808a53a
Apr 17 14:23:17.846129 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.846092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b4t8r" event={"ID":"c5d9d12d-4816-4a8c-954c-b83681df2cd9","Type":"ContainerStarted","Data":"57ccaad19603fac209a782ab427339aaae500deb0721602fdae0b487f808a53a"}
Apr 17 14:23:17.919730 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.919700 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq"]
Apr 17 14:23:17.922388 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:23:17.922357 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3196128_fba4_41e4_a197_d8c5cb0025cb.slice/crio-d5ddb14c7a8b9247eadba5ac4e23532af53556919cdb1d251bdb2e7c80c7f4d4 WatchSource:0}: Error finding container d5ddb14c7a8b9247eadba5ac4e23532af53556919cdb1d251bdb2e7c80c7f4d4: Status 404 returned error can't find the container with id d5ddb14c7a8b9247eadba5ac4e23532af53556919cdb1d251bdb2e7c80c7f4d4
Apr 17 14:23:17.924382 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.924363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:17.924463 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:17.924413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:17.924519 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:17.924496 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:23:17.924562 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:17.924548 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs podName:c067d95c-e09f-4c47-8281-f638f4740633 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:18.924534008 +0000 UTC m=+117.231287605 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs") pod "router-default-685d56bbbb-mp6gq" (UID: "c067d95c-e09f-4c47-8281-f638f4740633") : secret "router-metrics-certs-default" not found
Apr 17 14:23:17.924607 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:17.924594 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle podName:c067d95c-e09f-4c47-8281-f638f4740633 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:18.924574917 +0000 UTC m=+117.231328521 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle") pod "router-default-685d56bbbb-mp6gq" (UID: "c067d95c-e09f-4c47-8281-f638f4740633") : configmap references non-existent config key: service-ca.crt
Apr 17 14:23:18.126301 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:18.126249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv"
Apr 17 14:23:18.126486 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:18.126412 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 14:23:18.126486 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:18.126481 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls podName:7951b390-3fe0-4bc3-bd2a-fde607c15638 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:19.126463331 +0000 UTC m=+117.433216929 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zrlkv" (UID: "7951b390-3fe0-4bc3-bd2a-fde607c15638") : secret "cluster-monitoring-operator-tls" not found
Apr 17 14:23:18.850075 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:18.850039 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq" event={"ID":"b3196128-fba4-41e4-a197-d8c5cb0025cb","Type":"ContainerStarted","Data":"d5ddb14c7a8b9247eadba5ac4e23532af53556919cdb1d251bdb2e7c80c7f4d4"}
Apr 17 14:23:18.934301 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:18.934256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:18.934478 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:18.934354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:18.934478 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:18.934459 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:23:18.934575 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:18.934461 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle podName:c067d95c-e09f-4c47-8281-f638f4740633 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:20.934436881 +0000 UTC m=+119.241190492 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle") pod "router-default-685d56bbbb-mp6gq" (UID: "c067d95c-e09f-4c47-8281-f638f4740633") : configmap references non-existent config key: service-ca.crt
Apr 17 14:23:18.934575 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:18.934519 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs podName:c067d95c-e09f-4c47-8281-f638f4740633 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:20.934502221 +0000 UTC m=+119.241255824 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs") pod "router-default-685d56bbbb-mp6gq" (UID: "c067d95c-e09f-4c47-8281-f638f4740633") : secret "router-metrics-certs-default" not found
Apr 17 14:23:19.136358 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:19.136231 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 14:23:19.136358 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:19.136316 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls podName:7951b390-3fe0-4bc3-bd2a-fde607c15638 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:21.136297169 +0000 UTC m=+119.443050771 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zrlkv" (UID: "7951b390-3fe0-4bc3-bd2a-fde607c15638") : secret "cluster-monitoring-operator-tls" not found
Apr 17 14:23:19.136358 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:19.136175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv"
Apr 17 14:23:20.859474 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:20.859432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq" event={"ID":"b3196128-fba4-41e4-a197-d8c5cb0025cb","Type":"ContainerStarted","Data":"31a27b40727544efea83907cc25720f64825efdac95e49c89ac9de1eb76f6ffe"}
Apr 17 14:23:20.860746 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:20.860722 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b4t8r" event={"ID":"c5d9d12d-4816-4a8c-954c-b83681df2cd9","Type":"ContainerStarted","Data":"ef3e0f9ad9968bf21416fdcffcc0fdb34206f79240548344958edfafa3f18b8e"}
Apr 17 14:23:20.875245 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:20.875197 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq" podStartSLOduration=1.254876169 podStartE2EDuration="3.875181493s" podCreationTimestamp="2026-04-17 14:23:17 +0000 UTC" firstStartedPulling="2026-04-17 14:23:17.92404386 +0000 UTC m=+116.230797461" lastFinishedPulling="2026-04-17 14:23:20.544349183 +0000 UTC m=+118.851102785" observedRunningTime="2026-04-17 14:23:20.874906441 +0000 UTC m=+119.181660063" watchObservedRunningTime="2026-04-17 14:23:20.875181493 +0000 UTC m=+119.181935113"
Apr 17 14:23:20.892356 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:20.892295 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-b4t8r" podStartSLOduration=1.17342376 podStartE2EDuration="3.892278113s" podCreationTimestamp="2026-04-17 14:23:17 +0000 UTC" firstStartedPulling="2026-04-17 14:23:17.827675116 +0000 UTC m=+116.134428727" lastFinishedPulling="2026-04-17 14:23:20.546529478 +0000 UTC m=+118.853283080" observedRunningTime="2026-04-17 14:23:20.890913767 +0000 UTC m=+119.197667388" watchObservedRunningTime="2026-04-17 14:23:20.892278113 +0000 UTC m=+119.199031736"
Apr 17 14:23:20.952287 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:20.952215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:20.952479 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:20.952339 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:20.952544 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:20.952500 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:23:20.952600 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:20.952576 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs podName:c067d95c-e09f-4c47-8281-f638f4740633 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:24.952555504 +0000 UTC m=+123.259309126 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs") pod "router-default-685d56bbbb-mp6gq" (UID: "c067d95c-e09f-4c47-8281-f638f4740633") : secret "router-metrics-certs-default" not found
Apr 17 14:23:20.952942 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:20.952760 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle podName:c067d95c-e09f-4c47-8281-f638f4740633 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:24.952744717 +0000 UTC m=+123.259498315 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle") pod "router-default-685d56bbbb-mp6gq" (UID: "c067d95c-e09f-4c47-8281-f638f4740633") : configmap references non-existent config key: service-ca.crt
Apr 17 14:23:21.154155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:21.154039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv"
Apr 17 14:23:21.154337 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:21.154202 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 14:23:21.154337 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:21.154284 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls podName:7951b390-3fe0-4bc3-bd2a-fde607c15638 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:25.154262073 +0000 UTC m=+123.461015675 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zrlkv" (UID: "7951b390-3fe0-4bc3-bd2a-fde607c15638") : secret "cluster-monitoring-operator-tls" not found
Apr 17 14:23:23.663816 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:23.663791 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kvtp7_e8508015-adfb-42aa-acfc-92b24ec90241/dns-node-resolver/0.log"
Apr 17 14:23:24.264050 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.264020 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-58lbn_822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f/node-ca/0.log"
Apr 17 14:23:24.493435 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.493402 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-g9pnc"]
Apr 17 14:23:24.497322 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.497305 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-g9pnc"
Apr 17 14:23:24.499977 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.499955 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 14:23:24.500096 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.499956 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 14:23:24.501144 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.501124 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-lpdlz\""
Apr 17 14:23:24.501215 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.501170 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 14:23:24.501215 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.501134 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 14:23:24.502777 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.502757 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-g9pnc"]
Apr 17 14:23:24.583352 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.583309 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/35d01edb-30af-4406-86ce-ae52032a3653-signing-cabundle\") pod \"service-ca-865cb79987-g9pnc\" (UID: \"35d01edb-30af-4406-86ce-ae52032a3653\") " pod="openshift-service-ca/service-ca-865cb79987-g9pnc"
Apr 17 14:23:24.583521 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.583392 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/35d01edb-30af-4406-86ce-ae52032a3653-signing-key\") pod \"service-ca-865cb79987-g9pnc\" (UID: \"35d01edb-30af-4406-86ce-ae52032a3653\") " pod="openshift-service-ca/service-ca-865cb79987-g9pnc"
Apr 17 14:23:24.583521 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.583473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r75rc\" (UniqueName: \"kubernetes.io/projected/35d01edb-30af-4406-86ce-ae52032a3653-kube-api-access-r75rc\") pod \"service-ca-865cb79987-g9pnc\" (UID: \"35d01edb-30af-4406-86ce-ae52032a3653\") " pod="openshift-service-ca/service-ca-865cb79987-g9pnc"
Apr 17 14:23:24.684577 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.684538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r75rc\" (UniqueName: \"kubernetes.io/projected/35d01edb-30af-4406-86ce-ae52032a3653-kube-api-access-r75rc\") pod \"service-ca-865cb79987-g9pnc\" (UID: \"35d01edb-30af-4406-86ce-ae52032a3653\") " pod="openshift-service-ca/service-ca-865cb79987-g9pnc"
Apr 17 14:23:24.684577 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.684584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/35d01edb-30af-4406-86ce-ae52032a3653-signing-cabundle\") pod \"service-ca-865cb79987-g9pnc\" (UID: \"35d01edb-30af-4406-86ce-ae52032a3653\") " pod="openshift-service-ca/service-ca-865cb79987-g9pnc"
Apr 17 14:23:24.685051 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.684622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/35d01edb-30af-4406-86ce-ae52032a3653-signing-key\") pod \"service-ca-865cb79987-g9pnc\" (UID: \"35d01edb-30af-4406-86ce-ae52032a3653\") " pod="openshift-service-ca/service-ca-865cb79987-g9pnc"
Apr 17 14:23:24.685376 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.685356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/35d01edb-30af-4406-86ce-ae52032a3653-signing-cabundle\") pod \"service-ca-865cb79987-g9pnc\" (UID: \"35d01edb-30af-4406-86ce-ae52032a3653\") " pod="openshift-service-ca/service-ca-865cb79987-g9pnc"
Apr 17 14:23:24.686937 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.686919 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/35d01edb-30af-4406-86ce-ae52032a3653-signing-key\") pod \"service-ca-865cb79987-g9pnc\" (UID: \"35d01edb-30af-4406-86ce-ae52032a3653\") " pod="openshift-service-ca/service-ca-865cb79987-g9pnc"
Apr 17 14:23:24.692175 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.692154 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r75rc\" (UniqueName: \"kubernetes.io/projected/35d01edb-30af-4406-86ce-ae52032a3653-kube-api-access-r75rc\") pod \"service-ca-865cb79987-g9pnc\" (UID: \"35d01edb-30af-4406-86ce-ae52032a3653\") " pod="openshift-service-ca/service-ca-865cb79987-g9pnc"
Apr 17 14:23:24.806488 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.806457 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-g9pnc"
Apr 17 14:23:24.918724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.918695 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-g9pnc"]
Apr 17 14:23:24.921842 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:23:24.921817 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d01edb_30af_4406_86ce_ae52032a3653.slice/crio-f255ee614906ce51cf00b5a5bd5528a1eb5ad7d147af27af4c7481c2725ea253 WatchSource:0}: Error finding container f255ee614906ce51cf00b5a5bd5528a1eb5ad7d147af27af4c7481c2725ea253: Status 404 returned error can't find the container with id f255ee614906ce51cf00b5a5bd5528a1eb5ad7d147af27af4c7481c2725ea253
Apr 17 14:23:24.986813 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.986784 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:24.986945 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:24.986834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:24.986945 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:24.986934 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:23:24.986945 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:24.986945 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle podName:c067d95c-e09f-4c47-8281-f638f4740633 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:32.986931699 +0000 UTC m=+131.293685297 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle") pod "router-default-685d56bbbb-mp6gq" (UID: "c067d95c-e09f-4c47-8281-f638f4740633") : configmap references non-existent config key: service-ca.crt
Apr 17 14:23:24.987052 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:24.986986 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs podName:c067d95c-e09f-4c47-8281-f638f4740633 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:32.986971638 +0000 UTC m=+131.293725238 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs") pod "router-default-685d56bbbb-mp6gq" (UID: "c067d95c-e09f-4c47-8281-f638f4740633") : secret "router-metrics-certs-default" not found
Apr 17 14:23:25.188262 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:25.188169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv"
Apr 17 14:23:25.188409 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:25.188288 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 14:23:25.188409 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:25.188341 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls podName:7951b390-3fe0-4bc3-bd2a-fde607c15638 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:33.188327191 +0000 UTC m=+131.495080788 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zrlkv" (UID: "7951b390-3fe0-4bc3-bd2a-fde607c15638") : secret "cluster-monitoring-operator-tls" not found
Apr 17 14:23:25.872534 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:25.872499 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-g9pnc" event={"ID":"35d01edb-30af-4406-86ce-ae52032a3653","Type":"ContainerStarted","Data":"51066189a6b9578a30cd6b25c7e1de72ba39182e99dca11c8151fcb00724cd77"}
Apr 17 14:23:25.872534 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:25.872535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-g9pnc" event={"ID":"35d01edb-30af-4406-86ce-ae52032a3653","Type":"ContainerStarted","Data":"f255ee614906ce51cf00b5a5bd5528a1eb5ad7d147af27af4c7481c2725ea253"}
Apr 17 14:23:25.889916 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:25.889851 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-g9pnc" podStartSLOduration=1.889836437 podStartE2EDuration="1.889836437s" podCreationTimestamp="2026-04-17 14:23:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:23:25.889204785 +0000 UTC m=+124.195958418" watchObservedRunningTime="2026-04-17 14:23:25.889836437 +0000 UTC m=+124.196590056"
Apr 17 14:23:31.137407 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:31.137370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf"
Apr 17 14:23:31.137829 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:31.137522 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 14:23:31.137829 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:31.137589 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs podName:d0eca2ae-83f6-462e-b7d9-9ab1592717a8 nodeName:}" failed. No retries permitted until 2026-04-17 14:25:33.137573209 +0000 UTC m=+251.444326808 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs") pod "network-metrics-daemon-ccsgf" (UID: "d0eca2ae-83f6-462e-b7d9-9ab1592717a8") : secret "metrics-daemon-secret" not found
Apr 17 14:23:33.051975 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:33.051940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:33.052370 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:33.052001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:33.052609 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:33.052589 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c067d95c-e09f-4c47-8281-f638f4740633-service-ca-bundle\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:33.054464 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:33.054436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c067d95c-e09f-4c47-8281-f638f4740633-metrics-certs\") pod \"router-default-685d56bbbb-mp6gq\" (UID: \"c067d95c-e09f-4c47-8281-f638f4740633\") " pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:33.086959 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:33.086928 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-b9wcm\""
Apr 17 14:23:33.095157 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:33.095123 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-685d56bbbb-mp6gq"
Apr 17 14:23:33.239559 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:33.239536 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-685d56bbbb-mp6gq"]
Apr 17 14:23:33.241794 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:23:33.241763 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc067d95c_e09f_4c47_8281_f638f4740633.slice/crio-d77e329be80fd8aa752b16f297465484ef276d5891915d72e8bc6d6e03ca80a8 WatchSource:0}: Error finding container d77e329be80fd8aa752b16f297465484ef276d5891915d72e8bc6d6e03ca80a8: Status 404 returned error can't find the container with id d77e329be80fd8aa752b16f297465484ef276d5891915d72e8bc6d6e03ca80a8
Apr 17 14:23:33.254060 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:33.254032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" Apr 17 14:23:33.254164 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:33.254147 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 14:23:33.254237 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:33.254220 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls podName:7951b390-3fe0-4bc3-bd2a-fde607c15638 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:49.254200012 +0000 UTC m=+147.560953615 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zrlkv" (UID: "7951b390-3fe0-4bc3-bd2a-fde607c15638") : secret "cluster-monitoring-operator-tls" not found Apr 17 14:23:33.890019 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:33.889986 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-685d56bbbb-mp6gq" event={"ID":"c067d95c-e09f-4c47-8281-f638f4740633","Type":"ContainerStarted","Data":"66a486d59b205620ec1d4191cc3f3ac0f1cca8b561c2465eeff6de4552027a2d"} Apr 17 14:23:33.890019 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:33.890021 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-685d56bbbb-mp6gq" event={"ID":"c067d95c-e09f-4c47-8281-f638f4740633","Type":"ContainerStarted","Data":"d77e329be80fd8aa752b16f297465484ef276d5891915d72e8bc6d6e03ca80a8"} Apr 17 14:23:33.907968 
ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:33.907926 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-685d56bbbb-mp6gq" podStartSLOduration=16.907911671 podStartE2EDuration="16.907911671s" podCreationTimestamp="2026-04-17 14:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:23:33.906570445 +0000 UTC m=+132.213324066" watchObservedRunningTime="2026-04-17 14:23:33.907911671 +0000 UTC m=+132.214665291" Apr 17 14:23:34.095944 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:34.095909 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:34.098433 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:34.098409 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:34.893021 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:34.892988 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:34.894380 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:34.894356 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-685d56bbbb-mp6gq" Apr 17 14:23:48.219771 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.219738 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7b887f8f96-9xz6f"] Apr 17 14:23:48.222309 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.222287 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.224834 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.224813 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 14:23:48.224968 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.224894 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 14:23:48.224968 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.224914 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5dpr9\"" Apr 17 14:23:48.225088 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.225072 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 14:23:48.229862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.229845 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 14:23:48.234059 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.234039 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b887f8f96-9xz6f"] Apr 17 14:23:48.268514 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.268480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-image-registry-private-configuration\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.268668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.268523 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-trusted-ca\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.268668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.268552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6m68\" (UniqueName: \"kubernetes.io/projected/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-kube-api-access-m6m68\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.268668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.268634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-installation-pull-secrets\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.268805 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.268668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-ca-trust-extracted\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.268805 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.268745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-registry-tls\") pod 
\"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.268805 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.268794 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-registry-certificates\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.268924 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.268819 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-bound-sa-token\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.330403 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.330371 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-zphcj"] Apr 17 14:23:48.333438 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.333419 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.336136 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.336106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 14:23:48.336136 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.336127 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 14:23:48.336283 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.336127 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-j9ndf\"" Apr 17 14:23:48.343267 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.343217 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zphcj"] Apr 17 14:23:48.369669 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.369646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-ca-trust-extracted\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.369821 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.369691 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-registry-tls\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.369821 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.369715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" 
(UniqueName: \"kubernetes.io/configmap/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.369821 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.369756 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-registry-certificates\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.369821 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.369773 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-crio-socket\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.369821 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.369797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-bound-sa-token\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.370100 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.369826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mr77\" (UniqueName: \"kubernetes.io/projected/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-kube-api-access-5mr77\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " 
pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.370100 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.369865 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-image-registry-private-configuration\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.370100 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.369932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-trusted-ca\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.370100 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.369962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6m68\" (UniqueName: \"kubernetes.io/projected/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-kube-api-access-m6m68\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.370100 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.370017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-data-volume\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.370100 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.370044 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.370359 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.370111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-installation-pull-secrets\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.370359 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.370250 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-ca-trust-extracted\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.370951 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.370921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-registry-certificates\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.371065 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.371020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-trusted-ca\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " 
pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.372678 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.372658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-installation-pull-secrets\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.372759 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.372665 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-registry-tls\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.372797 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.372754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-image-registry-private-configuration\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.380681 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.380660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-bound-sa-token\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.380903 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.380882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6m68\" (UniqueName: 
\"kubernetes.io/projected/61a41f4f-43f3-483e-bf64-66b7a0d1f2a2-kube-api-access-m6m68\") pod \"image-registry-7b887f8f96-9xz6f\" (UID: \"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2\") " pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.470941 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.470841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.470941 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.470899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-crio-socket\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.471116 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.470950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mr77\" (UniqueName: \"kubernetes.io/projected/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-kube-api-access-5mr77\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.471116 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.470969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-crio-socket\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.471116 ip-10-0-138-3 kubenswrapper[2575]: I0417 
14:23:48.470986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-data-volume\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.471116 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.471025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.471302 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.471280 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-data-volume\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.471564 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.471547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.473308 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.473292 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zphcj\" (UID: 
\"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.479191 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.479165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mr77\" (UniqueName: \"kubernetes.io/projected/0090e783-2cdd-4c1a-b3ef-af664ae49c8f-kube-api-access-5mr77\") pod \"insights-runtime-extractor-zphcj\" (UID: \"0090e783-2cdd-4c1a-b3ef-af664ae49c8f\") " pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.531786 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.531756 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.642222 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.642191 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zphcj" Apr 17 14:23:48.647373 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.647349 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b887f8f96-9xz6f"] Apr 17 14:23:48.650864 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:23:48.650837 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61a41f4f_43f3_483e_bf64_66b7a0d1f2a2.slice/crio-2b0663a8a91d09c5f28e90c55a8c0aa7e6ad97eeed3982289b8cf119400cc951 WatchSource:0}: Error finding container 2b0663a8a91d09c5f28e90c55a8c0aa7e6ad97eeed3982289b8cf119400cc951: Status 404 returned error can't find the container with id 2b0663a8a91d09c5f28e90c55a8c0aa7e6ad97eeed3982289b8cf119400cc951 Apr 17 14:23:48.765390 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.765360 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zphcj"] Apr 17 14:23:48.768291 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:23:48.768263 2575 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0090e783_2cdd_4c1a_b3ef_af664ae49c8f.slice/crio-1b57bd38bb2693ddd5d8398c5313ecefb72bfcf938f49c7e1e96fda476ae1e95 WatchSource:0}: Error finding container 1b57bd38bb2693ddd5d8398c5313ecefb72bfcf938f49c7e1e96fda476ae1e95: Status 404 returned error can't find the container with id 1b57bd38bb2693ddd5d8398c5313ecefb72bfcf938f49c7e1e96fda476ae1e95 Apr 17 14:23:48.927596 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.927552 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zphcj" event={"ID":"0090e783-2cdd-4c1a-b3ef-af664ae49c8f","Type":"ContainerStarted","Data":"58680ed8ff5276ef2702a676c59f8de195ae70f47091c77f93b60935504743a5"} Apr 17 14:23:48.927765 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.927603 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zphcj" event={"ID":"0090e783-2cdd-4c1a-b3ef-af664ae49c8f","Type":"ContainerStarted","Data":"1b57bd38bb2693ddd5d8398c5313ecefb72bfcf938f49c7e1e96fda476ae1e95"} Apr 17 14:23:48.928836 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.928812 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" event={"ID":"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2","Type":"ContainerStarted","Data":"1fd569ec9349647d466de34aa3299540fca0a57c1f5435fe9b69b93b611d35cd"} Apr 17 14:23:48.928836 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.928840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" event={"ID":"61a41f4f-43f3-483e-bf64-66b7a0d1f2a2","Type":"ContainerStarted","Data":"2b0663a8a91d09c5f28e90c55a8c0aa7e6ad97eeed3982289b8cf119400cc951"} Apr 17 14:23:48.929040 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.928928 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" Apr 17 14:23:48.946668 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:48.946625 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" podStartSLOduration=0.946610627 podStartE2EDuration="946.610627ms" podCreationTimestamp="2026-04-17 14:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:23:48.946130896 +0000 UTC m=+147.252884517" watchObservedRunningTime="2026-04-17 14:23:48.946610627 +0000 UTC m=+147.253364225" Apr 17 14:23:49.277479 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:49.277444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" Apr 17 14:23:49.279790 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:49.279767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7951b390-3fe0-4bc3-bd2a-fde607c15638-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zrlkv\" (UID: \"7951b390-3fe0-4bc3-bd2a-fde607c15638\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" Apr 17 14:23:49.496994 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:49.496958 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-dnz54\"" Apr 17 14:23:49.505376 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:49.505347 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" Apr 17 14:23:49.627976 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:49.627946 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv"] Apr 17 14:23:49.630653 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:23:49.630628 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7951b390_3fe0_4bc3_bd2a_fde607c15638.slice/crio-35ef7bec7822ce09207300f9984cafd0f0eb569eeca26859b83d3acead396563 WatchSource:0}: Error finding container 35ef7bec7822ce09207300f9984cafd0f0eb569eeca26859b83d3acead396563: Status 404 returned error can't find the container with id 35ef7bec7822ce09207300f9984cafd0f0eb569eeca26859b83d3acead396563 Apr 17 14:23:49.933007 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:49.932972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zphcj" event={"ID":"0090e783-2cdd-4c1a-b3ef-af664ae49c8f","Type":"ContainerStarted","Data":"20bfe9ea36b2cb528c568b4d71c697cfe7645a01f111d3ec8ca96578fe3f3cd2"} Apr 17 14:23:49.933845 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:49.933823 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" event={"ID":"7951b390-3fe0-4bc3-bd2a-fde607c15638","Type":"ContainerStarted","Data":"35ef7bec7822ce09207300f9984cafd0f0eb569eeca26859b83d3acead396563"} Apr 17 14:23:51.941017 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:51.940919 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zphcj" event={"ID":"0090e783-2cdd-4c1a-b3ef-af664ae49c8f","Type":"ContainerStarted","Data":"83942708541b57399b5f6c487edfba2d0c24f5273361e0f972c0b33897e1ef82"} Apr 17 14:23:51.942286 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:51.942262 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" event={"ID":"7951b390-3fe0-4bc3-bd2a-fde607c15638","Type":"ContainerStarted","Data":"deb06864ad74029352e1cabd068bd9319e4f2f8150dbbbca785b7df2a4d41d0e"} Apr 17 14:23:51.960637 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:51.960588 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-zphcj" podStartSLOduration=1.213636923 podStartE2EDuration="3.960572457s" podCreationTimestamp="2026-04-17 14:23:48 +0000 UTC" firstStartedPulling="2026-04-17 14:23:48.828731562 +0000 UTC m=+147.135485168" lastFinishedPulling="2026-04-17 14:23:51.575667098 +0000 UTC m=+149.882420702" observedRunningTime="2026-04-17 14:23:51.958107027 +0000 UTC m=+150.264860647" watchObservedRunningTime="2026-04-17 14:23:51.960572457 +0000 UTC m=+150.267326074" Apr 17 14:23:51.973469 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:51.973428 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zrlkv" podStartSLOduration=33.026864091 podStartE2EDuration="34.973416999s" podCreationTimestamp="2026-04-17 14:23:17 +0000 UTC" firstStartedPulling="2026-04-17 14:23:49.632492241 +0000 UTC m=+147.939245840" lastFinishedPulling="2026-04-17 14:23:51.579045131 +0000 UTC m=+149.885798748" observedRunningTime="2026-04-17 14:23:51.973100814 +0000 UTC m=+150.279854435" watchObservedRunningTime="2026-04-17 14:23:51.973416999 +0000 UTC m=+150.280170965" Apr 17 14:23:57.535675 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:57.535612 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lp8nz" podUID="822a8df9-29ea-4649-a163-22e1db926c84" Apr 17 14:23:57.551941 ip-10-0-138-3 kubenswrapper[2575]: E0417 
14:23:57.551911 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-869tz" podUID="d37b1dac-43fd-47dd-9f14-18b1f81b8155" Apr 17 14:23:57.956503 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:23:57.956474 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lp8nz" Apr 17 14:23:58.413097 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:23:58.413055 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-ccsgf" podUID="d0eca2ae-83f6-462e-b7d9-9ab1592717a8" Apr 17 14:24:01.450436 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.450399 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2"] Apr 17 14:24:01.455224 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.455205 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.459362 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.459341 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 14:24:01.460162 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.460144 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qgmpp\"" Apr 17 14:24:01.460889 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.460849 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 14:24:01.461006 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.460932 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 14:24:01.470769 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.470741 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2"] Apr 17 14:24:01.502250 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.502218 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pnrxk"] Apr 17 14:24:01.505420 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.505397 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.511230 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.511206 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 14:24:01.511623 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.511494 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 14:24:01.511802 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.511786 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-c4k9f\"" Apr 17 14:24:01.512817 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.512788 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 14:24:01.581148 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581111 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8d4b72a-7437-4185-ba66-defa6c9c36cf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.581148 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/48c89e9d-7b69-42c5-a313-71254165cc50-root\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.581414 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-wtmp\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.581414 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8d4b72a-7437-4185-ba66-defa6c9c36cf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.581414 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52vqp\" (UniqueName: \"kubernetes.io/projected/e8d4b72a-7437-4185-ba66-defa6c9c36cf-kube-api-access-52vqp\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.581414 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-accelerators-collector-config\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.581414 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48c89e9d-7b69-42c5-a313-71254165cc50-metrics-client-ca\") pod 
\"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.581414 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e8d4b72a-7437-4185-ba66-defa6c9c36cf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.581414 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581406 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bcf2\" (UniqueName: \"kubernetes.io/projected/48c89e9d-7b69-42c5-a313-71254165cc50-kube-api-access-5bcf2\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.581685 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-tls\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.581685 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.581685 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:24:01.581556 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48c89e9d-7b69-42c5-a313-71254165cc50-sys\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.581685 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.581587 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-textfile\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683024 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.682986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48c89e9d-7b69-42c5-a313-71254165cc50-sys\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683024 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-textfile\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683265 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8d4b72a-7437-4185-ba66-defa6c9c36cf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.683265 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683105 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48c89e9d-7b69-42c5-a313-71254165cc50-sys\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683265 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683112 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/48c89e9d-7b69-42c5-a313-71254165cc50-root\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683265 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/48c89e9d-7b69-42c5-a313-71254165cc50-root\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683265 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-wtmp\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683265 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683219 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8d4b72a-7437-4185-ba66-defa6c9c36cf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.683265 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52vqp\" (UniqueName: \"kubernetes.io/projected/e8d4b72a-7437-4185-ba66-defa6c9c36cf-kube-api-access-52vqp\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.683598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-accelerators-collector-config\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48c89e9d-7b69-42c5-a313-71254165cc50-metrics-client-ca\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683345 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e8d4b72a-7437-4185-ba66-defa6c9c36cf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.683598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683344 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-wtmp\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bcf2\" (UniqueName: \"kubernetes.io/projected/48c89e9d-7b69-42c5-a313-71254165cc50-kube-api-access-5bcf2\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-textfile\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-tls\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.683598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.684008 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:24:01.683923 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8d4b72a-7437-4185-ba66-defa6c9c36cf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.684008 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.683993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48c89e9d-7b69-42c5-a313-71254165cc50-metrics-client-ca\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.684175 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.684157 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-accelerators-collector-config\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.685753 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.685721 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8d4b72a-7437-4185-ba66-defa6c9c36cf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.685894 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.685811 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/e8d4b72a-7437-4185-ba66-defa6c9c36cf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.686617 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.686598 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.687019 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.686994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/48c89e9d-7b69-42c5-a313-71254165cc50-node-exporter-tls\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.694679 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.694658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bcf2\" (UniqueName: \"kubernetes.io/projected/48c89e9d-7b69-42c5-a313-71254165cc50-kube-api-access-5bcf2\") pod \"node-exporter-pnrxk\" (UID: \"48c89e9d-7b69-42c5-a313-71254165cc50\") " pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.695825 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.695809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52vqp\" (UniqueName: \"kubernetes.io/projected/e8d4b72a-7437-4185-ba66-defa6c9c36cf-kube-api-access-52vqp\") pod \"openshift-state-metrics-9d44df66c-t7ww2\" (UID: \"e8d4b72a-7437-4185-ba66-defa6c9c36cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.764780 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:24:01.764699 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" Apr 17 14:24:01.816086 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.815620 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pnrxk" Apr 17 14:24:01.827298 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:24:01.827244 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48c89e9d_7b69_42c5_a313_71254165cc50.slice/crio-1b4c7de0455e15fe5554b4fcac51f84afc02c6307adcbfa9b7b21619573d55f6 WatchSource:0}: Error finding container 1b4c7de0455e15fe5554b4fcac51f84afc02c6307adcbfa9b7b21619573d55f6: Status 404 returned error can't find the container with id 1b4c7de0455e15fe5554b4fcac51f84afc02c6307adcbfa9b7b21619573d55f6 Apr 17 14:24:01.913276 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.913242 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2"] Apr 17 14:24:01.916170 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:24:01.916141 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8d4b72a_7437_4185_ba66_defa6c9c36cf.slice/crio-4a5d568fc3c8cf7763cf35c683d2fabe633af229487743b939391708c29642aa WatchSource:0}: Error finding container 4a5d568fc3c8cf7763cf35c683d2fabe633af229487743b939391708c29642aa: Status 404 returned error can't find the container with id 4a5d568fc3c8cf7763cf35c683d2fabe633af229487743b939391708c29642aa Apr 17 14:24:01.967844 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.967796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pnrxk" 
event={"ID":"48c89e9d-7b69-42c5-a313-71254165cc50","Type":"ContainerStarted","Data":"1b4c7de0455e15fe5554b4fcac51f84afc02c6307adcbfa9b7b21619573d55f6"} Apr 17 14:24:01.969228 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:01.969201 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" event={"ID":"e8d4b72a-7437-4185-ba66-defa6c9c36cf","Type":"ContainerStarted","Data":"4a5d568fc3c8cf7763cf35c683d2fabe633af229487743b939391708c29642aa"} Apr 17 14:24:02.489629 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.489593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " pod="openshift-ingress-canary/ingress-canary-869tz" Apr 17 14:24:02.490129 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.489696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:24:02.493157 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.493126 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822a8df9-29ea-4649-a163-22e1db926c84-metrics-tls\") pod \"dns-default-lp8nz\" (UID: \"822a8df9-29ea-4649-a163-22e1db926c84\") " pod="openshift-dns/dns-default-lp8nz" Apr 17 14:24:02.493157 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.493149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d37b1dac-43fd-47dd-9f14-18b1f81b8155-cert\") pod \"ingress-canary-869tz\" (UID: \"d37b1dac-43fd-47dd-9f14-18b1f81b8155\") " 
pod="openshift-ingress-canary/ingress-canary-869tz"
Apr 17 14:24:02.516286 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.516253 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 14:24:02.521128 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.521101 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.523770 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.523566 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 14:24:02.523770 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.523578 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 14:24:02.523968 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.523842 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 14:24:02.523968 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.523915 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 14:24:02.523968 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.523925 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 14:24:02.524163 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.524141 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 14:24:02.524400 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.524377 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 14:24:02.525435 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.525157 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xkzrg\""
Apr 17 14:24:02.525435 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.525172 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 14:24:02.525435 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.525336 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 14:24:02.533939 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.532401 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 14:24:02.591071 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.591276 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591119 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.596216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591378 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.596216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591459 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.596216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591559 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.596216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591656 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-config-volume\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.596216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.596216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591753 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-config-out\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.596216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvws9\" (UniqueName: \"kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-kube-api-access-hvws9\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.596216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-web-config\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.596216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.596216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591936 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.596216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.591977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.692666 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.692631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-web-config\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.692767 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.692687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.692767 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.692723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.692767 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.692757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.692940 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.692799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.692940 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.692833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.692940 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.692863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.692940 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.692920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.693132 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.692950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.693132 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.692986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-config-volume\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.693132 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.693009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.693132 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.693057 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-config-out\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.693132 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:24:02.693072 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 17 14:24:02.693132 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.693084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvws9\" (UniqueName: \"kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-kube-api-access-hvws9\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.693420 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:24:02.693158 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-main-tls podName:e49357e5-7634-4418-8ccc-f492994acdaa nodeName:}" failed. No retries permitted until 2026-04-17 14:24:03.193135462 +0000 UTC m=+161.499889060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa") : secret "alertmanager-main-tls" not found
Apr 17 14:24:02.694214 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.694190 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.694326 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:24:02.694311 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-trusted-ca-bundle podName:e49357e5-7634-4418-8ccc-f492994acdaa nodeName:}" failed. No retries permitted until 2026-04-17 14:24:03.194292744 +0000 UTC m=+161.501046346 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa") : configmap references non-existent config key: ca-bundle.crt
Apr 17 14:24:02.694977 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.694949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.695650 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.695571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.695783 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.695755 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-web-config\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.696791 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.696592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-config-out\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.696791 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.696754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.697794 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.697766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.697948 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.697929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.698047 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.698027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-config-volume\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.698958 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.698936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.702559 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.702537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvws9\" (UniqueName: \"kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-kube-api-access-hvws9\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:02.759511 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.759484 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rkhth\""
Apr 17 14:24:02.767308 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.767278 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lp8nz"
Apr 17 14:24:02.917889 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.917844 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lp8nz"]
Apr 17 14:24:02.943396 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:24:02.943321 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod822a8df9_29ea_4649_a163_22e1db926c84.slice/crio-52f05dedc2fc4d85ffade02435f5e531cc34a6bb71a28b8e6a615a8539cfb864 WatchSource:0}: Error finding container 52f05dedc2fc4d85ffade02435f5e531cc34a6bb71a28b8e6a615a8539cfb864: Status 404 returned error can't find the container with id 52f05dedc2fc4d85ffade02435f5e531cc34a6bb71a28b8e6a615a8539cfb864
Apr 17 14:24:02.973612 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.973574 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lp8nz" event={"ID":"822a8df9-29ea-4649-a163-22e1db926c84","Type":"ContainerStarted","Data":"52f05dedc2fc4d85ffade02435f5e531cc34a6bb71a28b8e6a615a8539cfb864"}
Apr 17 14:24:02.975311 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.975281 2575 generic.go:358] "Generic (PLEG): container finished" podID="48c89e9d-7b69-42c5-a313-71254165cc50" containerID="7746cd04166654b172a22ab9079c6882f3f479670e45fb91e94ef07612b310b7" exitCode=0
Apr 17 14:24:02.975428 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.975365 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pnrxk" event={"ID":"48c89e9d-7b69-42c5-a313-71254165cc50","Type":"ContainerDied","Data":"7746cd04166654b172a22ab9079c6882f3f479670e45fb91e94ef07612b310b7"}
Apr 17 14:24:02.977820 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.977794 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" event={"ID":"e8d4b72a-7437-4185-ba66-defa6c9c36cf","Type":"ContainerStarted","Data":"ae074723f4f5b1501b6c3d5671d73c6cb2f66f5b4741f40d19cec6ff1a7a3646"}
Apr 17 14:24:02.977939 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:02.977829 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" event={"ID":"e8d4b72a-7437-4185-ba66-defa6c9c36cf","Type":"ContainerStarted","Data":"4166f3798db3e557f957bcdb3077829e3d61f8094f0d1e22eca6b09b508a4467"}
Apr 17 14:24:03.197057 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:03.197026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:03.197233 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:03.197156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:03.198037 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:03.198007 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:03.199741 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:03.199717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:03.435747 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:03.435705 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:24:03.565424 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:03.565397 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 14:24:03.568034 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:24:03.568005 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode49357e5_7634_4418_8ccc_f492994acdaa.slice/crio-6106b8332278d6716325ba4cae09e16900123f41fe4f9c396d8e74abf9c05ccf WatchSource:0}: Error finding container 6106b8332278d6716325ba4cae09e16900123f41fe4f9c396d8e74abf9c05ccf: Status 404 returned error can't find the container with id 6106b8332278d6716325ba4cae09e16900123f41fe4f9c396d8e74abf9c05ccf
Apr 17 14:24:03.983099 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:03.983009 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pnrxk" event={"ID":"48c89e9d-7b69-42c5-a313-71254165cc50","Type":"ContainerStarted","Data":"81c6b536a8bba91b56a860896646650c9813b2ada6c5922145af5ef806f4e0fd"}
Apr 17 14:24:03.983099 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:03.983058 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pnrxk" event={"ID":"48c89e9d-7b69-42c5-a313-71254165cc50","Type":"ContainerStarted","Data":"adf81c693a0a84e6a4f40d12e8d47c0ea146cbb95fcdc31e1b1d18731ba1882a"}
Apr 17 14:24:03.985051 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:03.985019 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" event={"ID":"e8d4b72a-7437-4185-ba66-defa6c9c36cf","Type":"ContainerStarted","Data":"bf625764ae6df16d5768c70feb01e3305ea027cd973cd4b537c473ab197349c0"}
Apr 17 14:24:03.986172 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:03.986146 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerStarted","Data":"6106b8332278d6716325ba4cae09e16900123f41fe4f9c396d8e74abf9c05ccf"}
Apr 17 14:24:04.003015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.002970 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pnrxk" podStartSLOduration=2.16168253 podStartE2EDuration="3.002953273s" podCreationTimestamp="2026-04-17 14:24:01 +0000 UTC" firstStartedPulling="2026-04-17 14:24:01.829014073 +0000 UTC m=+160.135767672" lastFinishedPulling="2026-04-17 14:24:02.67028481 +0000 UTC m=+160.977038415" observedRunningTime="2026-04-17 14:24:04.0007705 +0000 UTC m=+162.307524139" watchObservedRunningTime="2026-04-17 14:24:04.002953273 +0000 UTC m=+162.309706893"
Apr 17 14:24:04.017400 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.017356 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t7ww2" podStartSLOduration=1.69678034 podStartE2EDuration="3.017342998s" podCreationTimestamp="2026-04-17 14:24:01 +0000 UTC" firstStartedPulling="2026-04-17 14:24:02.035596341 +0000 UTC m=+160.342349942" lastFinishedPulling="2026-04-17 14:24:03.356158997 +0000 UTC m=+161.662912600" observedRunningTime="2026-04-17 14:24:04.016591647 +0000 UTC m=+162.323345267" watchObservedRunningTime="2026-04-17 14:24:04.017342998 +0000 UTC m=+162.324096617"
Apr 17 14:24:04.439845 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.439723 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"]
Apr 17 14:24:04.443751 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.443710 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.446637 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.446612 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 14:24:04.446637 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.446627 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 14:24:04.446903 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.446881 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-5ojfuvo6phf3\""
Apr 17 14:24:04.447009 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.446937 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 14:24:04.447009 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.446863 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 14:24:04.447100 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.446942 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-2rjdw\""
Apr 17 14:24:04.447100 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.446892 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 14:24:04.460400 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.460375 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"]
Apr 17 14:24:04.509866 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.509824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60b0eda9-1239-430f-bd31-254ff621c737-metrics-client-ca\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.510032 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.509903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.510032 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.509965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-grpc-tls\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.510032 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.510015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.510136 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.510053 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.510136 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.510082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.510136 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.510106 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-tls\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.510279 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.510215 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdp25\" (UniqueName: \"kubernetes.io/projected/60b0eda9-1239-430f-bd31-254ff621c737-kube-api-access-zdp25\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.611210 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.611179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60b0eda9-1239-430f-bd31-254ff621c737-metrics-client-ca\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.611640 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.611226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.611640 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.611273 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-grpc-tls\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.611640 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.611302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.611640 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.611333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.611640 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.611368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.611640 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.611401 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-tls\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.611640 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.611488 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdp25\" (UniqueName: \"kubernetes.io/projected/60b0eda9-1239-430f-bd31-254ff621c737-kube-api-access-zdp25\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.612934 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.612903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60b0eda9-1239-430f-bd31-254ff621c737-metrics-client-ca\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.615027 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.614997 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.615192 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.615167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.615472 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.615438 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.615472 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.615447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-tls\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:04.615624 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.615498 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-grpc-tls\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") "
pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" Apr 17 14:24:04.615841 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.615818 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/60b0eda9-1239-430f-bd31-254ff621c737-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" Apr 17 14:24:04.618830 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.618810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdp25\" (UniqueName: \"kubernetes.io/projected/60b0eda9-1239-430f-bd31-254ff621c737-kube-api-access-zdp25\") pod \"thanos-querier-7f8cfbb8c7-ptxbt\" (UID: \"60b0eda9-1239-430f-bd31-254ff621c737\") " pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" Apr 17 14:24:04.757056 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.756978 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" Apr 17 14:24:04.949574 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.949525 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"] Apr 17 14:24:04.966843 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:24:04.966809 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60b0eda9_1239_430f_bd31_254ff621c737.slice/crio-0aff611fc4cfd5f9bf60225fab8ca96631f10e4dc1948400adb6b86537d44f82 WatchSource:0}: Error finding container 0aff611fc4cfd5f9bf60225fab8ca96631f10e4dc1948400adb6b86537d44f82: Status 404 returned error can't find the container with id 0aff611fc4cfd5f9bf60225fab8ca96631f10e4dc1948400adb6b86537d44f82 Apr 17 14:24:04.989881 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.989841 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lp8nz" event={"ID":"822a8df9-29ea-4649-a163-22e1db926c84","Type":"ContainerStarted","Data":"d47bf4eb3a89ba96df888ee6455625decaa9e113bfb851c4970f29141f2cb127"} Apr 17 14:24:04.991230 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.991207 2575 generic.go:358] "Generic (PLEG): container finished" podID="e49357e5-7634-4418-8ccc-f492994acdaa" containerID="fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4" exitCode=0 Apr 17 14:24:04.991319 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.991286 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerDied","Data":"fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4"} Apr 17 14:24:04.992406 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:04.992369 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" 
event={"ID":"60b0eda9-1239-430f-bd31-254ff621c737","Type":"ContainerStarted","Data":"0aff611fc4cfd5f9bf60225fab8ca96631f10e4dc1948400adb6b86537d44f82"} Apr 17 14:24:05.862680 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.862643 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-558d8d6877-tvdz9"] Apr 17 14:24:05.865853 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.865835 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:05.868485 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.868463 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 14:24:05.868783 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.868763 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 14:24:05.870043 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.870021 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-d2f8m12r8ljf8\"" Apr 17 14:24:05.870149 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.870113 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 14:24:05.870227 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.870168 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-46ttg\"" Apr 17 14:24:05.870280 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.870235 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 14:24:05.873477 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.873453 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/metrics-server-558d8d6877-tvdz9"] Apr 17 14:24:05.925735 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.925681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpqvl\" (UniqueName: \"kubernetes.io/projected/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-kube-api-access-lpqvl\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:05.925941 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.925751 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-metrics-server-audit-profiles\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:05.925941 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.925788 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-secret-metrics-server-tls\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:05.925941 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.925822 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-audit-log\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:05.925941 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.925849 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-client-ca-bundle\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:05.925941 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.925930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:05.926218 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:05.925984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-secret-metrics-server-client-certs\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.000602 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.000564 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lp8nz" event={"ID":"822a8df9-29ea-4649-a163-22e1db926c84","Type":"ContainerStarted","Data":"6d81bb7ded3023cc82d95e3e8d814cc160cdb873518a3372efe442d5a626adcf"} Apr 17 14:24:06.001263 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.001229 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lp8nz" Apr 17 14:24:06.018593 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.018550 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-lp8nz" podStartSLOduration=130.172047475 podStartE2EDuration="2m12.01853676s" podCreationTimestamp="2026-04-17 14:21:54 +0000 UTC" firstStartedPulling="2026-04-17 14:24:02.945365437 +0000 UTC m=+161.252119039" lastFinishedPulling="2026-04-17 14:24:04.791854721 +0000 UTC m=+163.098608324" observedRunningTime="2026-04-17 14:24:06.01702632 +0000 UTC m=+164.323779983" watchObservedRunningTime="2026-04-17 14:24:06.01853676 +0000 UTC m=+164.325290380" Apr 17 14:24:06.027372 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.027341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpqvl\" (UniqueName: \"kubernetes.io/projected/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-kube-api-access-lpqvl\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.027524 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.027403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-metrics-server-audit-profiles\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.027524 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.027433 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-secret-metrics-server-tls\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.027524 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.027469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" 
(UniqueName: \"kubernetes.io/empty-dir/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-audit-log\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.027524 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.027511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-client-ca-bundle\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.027743 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.027545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.027743 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.027576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-secret-metrics-server-client-certs\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.028328 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.028259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-audit-log\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 
14:24:06.028615 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.028576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-metrics-server-audit-profiles\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.029217 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.029196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.030515 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.030491 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-secret-metrics-server-tls\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.030620 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.030570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-client-ca-bundle\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.030680 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.030663 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-secret-metrics-server-client-certs\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.035650 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.035629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpqvl\" (UniqueName: \"kubernetes.io/projected/d3b554de-ad0f-47d2-8d4e-73fe61a9588d-kube-api-access-lpqvl\") pod \"metrics-server-558d8d6877-tvdz9\" (UID: \"d3b554de-ad0f-47d2-8d4e-73fe61a9588d\") " pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.178040 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.177664 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" Apr 17 14:24:06.659502 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:06.659480 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-558d8d6877-tvdz9"] Apr 17 14:24:07.075949 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:24:07.075918 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3b554de_ad0f_47d2_8d4e_73fe61a9588d.slice/crio-ecaf13a00fa193a50ff0881ef0adf6e2a9b1902e98fb1dd772dd2448e26377cd WatchSource:0}: Error finding container ecaf13a00fa193a50ff0881ef0adf6e2a9b1902e98fb1dd772dd2448e26377cd: Status 404 returned error can't find the container with id ecaf13a00fa193a50ff0881ef0adf6e2a9b1902e98fb1dd772dd2448e26377cd Apr 17 14:24:08.011132 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:08.011086 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" event={"ID":"60b0eda9-1239-430f-bd31-254ff621c737","Type":"ContainerStarted","Data":"c85b5518fcceae4d550e63af02a3ad3b1d9555a10a57bf4e1286267bee9d8393"} Apr 17 
14:24:08.011132 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:08.011136 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" event={"ID":"60b0eda9-1239-430f-bd31-254ff621c737","Type":"ContainerStarted","Data":"48f4ae9b05cccfd330915a6b244da29689714a56fad2956f10e14a423547dd5c"} Apr 17 14:24:08.011408 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:08.011153 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" event={"ID":"60b0eda9-1239-430f-bd31-254ff621c737","Type":"ContainerStarted","Data":"175957623a80d5c4ae8dbc4acc7cdfa284268f522525f8b508022b0604843495"} Apr 17 14:24:08.017771 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:08.017703 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" event={"ID":"d3b554de-ad0f-47d2-8d4e-73fe61a9588d","Type":"ContainerStarted","Data":"ecaf13a00fa193a50ff0881ef0adf6e2a9b1902e98fb1dd772dd2448e26377cd"} Apr 17 14:24:08.023128 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:08.023026 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerStarted","Data":"e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2"} Apr 17 14:24:08.023128 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:08.023063 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerStarted","Data":"54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73"} Apr 17 14:24:08.023128 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:08.023076 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerStarted","Data":"9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55"} Apr 17 14:24:08.023128 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:08.023089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerStarted","Data":"6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda"} Apr 17 14:24:08.023128 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:08.023101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerStarted","Data":"1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03"} Apr 17 14:24:08.536177 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:08.536139 2575 patch_prober.go:28] interesting pod/image-registry-7b887f8f96-9xz6f container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 14:24:08.536640 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:08.536203 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" podUID="61a41f4f-43f3-483e-bf64-66b7a0d1f2a2" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 14:24:09.026901 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.026851 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" event={"ID":"d3b554de-ad0f-47d2-8d4e-73fe61a9588d","Type":"ContainerStarted","Data":"702ae02750a70e54cd76d135166c26a2cea8f698082b2f5e59e7fe59b213bd2e"} Apr 17 14:24:09.029843 ip-10-0-138-3 kubenswrapper[2575]: I0417 
14:24:09.029819 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerStarted","Data":"ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769"} Apr 17 14:24:09.032518 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.032496 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" event={"ID":"60b0eda9-1239-430f-bd31-254ff621c737","Type":"ContainerStarted","Data":"c8d1e1de9decb45c2240cab9d5a61d13eec9ff70bd604f1eef9511345bfa3903"} Apr 17 14:24:09.032518 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.032521 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" event={"ID":"60b0eda9-1239-430f-bd31-254ff621c737","Type":"ContainerStarted","Data":"1f8561808acd072e65529330dd4227d1542deaad03e547471e33f080582d50ac"} Apr 17 14:24:09.032683 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.032530 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" event={"ID":"60b0eda9-1239-430f-bd31-254ff621c737","Type":"ContainerStarted","Data":"68bd053442332a131a3e3f00715857be51b694a057f47e552f6a5767be7333c9"} Apr 17 14:24:09.032716 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.032688 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" Apr 17 14:24:09.042974 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.042886 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9" podStartSLOduration=2.476477003 podStartE2EDuration="4.042853373s" podCreationTimestamp="2026-04-17 14:24:05 +0000 UTC" firstStartedPulling="2026-04-17 14:24:07.102283917 +0000 UTC m=+165.409037526" lastFinishedPulling="2026-04-17 14:24:08.668660292 +0000 UTC 
m=+166.975413896" observedRunningTime="2026-04-17 14:24:09.042591828 +0000 UTC m=+167.349345451" watchObservedRunningTime="2026-04-17 14:24:09.042853373 +0000 UTC m=+167.349606994" Apr 17 14:24:09.071937 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.071887 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.9739882739999999 podStartE2EDuration="7.071857592s" podCreationTimestamp="2026-04-17 14:24:02 +0000 UTC" firstStartedPulling="2026-04-17 14:24:03.569847941 +0000 UTC m=+161.876601539" lastFinishedPulling="2026-04-17 14:24:08.667717248 +0000 UTC m=+166.974470857" observedRunningTime="2026-04-17 14:24:09.069560361 +0000 UTC m=+167.376313983" watchObservedRunningTime="2026-04-17 14:24:09.071857592 +0000 UTC m=+167.378611211" Apr 17 14:24:09.089366 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.089321 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt" podStartSLOduration=1.388014665 podStartE2EDuration="5.089305512s" podCreationTimestamp="2026-04-17 14:24:04 +0000 UTC" firstStartedPulling="2026-04-17 14:24:04.968949976 +0000 UTC m=+163.275703574" lastFinishedPulling="2026-04-17 14:24:08.670240816 +0000 UTC m=+166.976994421" observedRunningTime="2026-04-17 14:24:09.088531673 +0000 UTC m=+167.395285294" watchObservedRunningTime="2026-04-17 14:24:09.089305512 +0000 UTC m=+167.396059162" Apr 17 14:24:09.185118 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.185086 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d6789fbd5-8428x"] Apr 17 14:24:09.188097 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.188080 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.190972 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.190934 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 14:24:09.191080 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.191044 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 14:24:09.191258 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.191233 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 14:24:09.191381 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.191363 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 14:24:09.191445 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.191420 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 14:24:09.191522 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.191507 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 14:24:09.191574 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.191526 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 14:24:09.191774 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.191758 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lrnz8\""
Apr 17 14:24:09.201043 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.201017 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d6789fbd5-8428x"]
Apr 17 14:24:09.261719 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.261675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-config\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.261910 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.261732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-oauth-serving-cert\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.261910 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.261839 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-serving-cert\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.261910 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.261899 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-service-ca\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.262009 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.261959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7jb9\" (UniqueName: \"kubernetes.io/projected/b98644a1-99ab-4fab-be81-f0624b8a4c92-kube-api-access-s7jb9\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.262009 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.261989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-oauth-config\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.363089 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.363056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7jb9\" (UniqueName: \"kubernetes.io/projected/b98644a1-99ab-4fab-be81-f0624b8a4c92-kube-api-access-s7jb9\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.363272 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.363103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-oauth-config\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.363272 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.363119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-config\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.363272 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.363140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-oauth-serving-cert\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.363272 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.363188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-serving-cert\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.363272 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.363216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-service-ca\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.363897 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.363847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-config\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.364012 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.363990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-service-ca\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.364091 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.363990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-oauth-serving-cert\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.365595 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.365575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-oauth-config\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.365673 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.365656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-serving-cert\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.371076 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.371056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7jb9\" (UniqueName: \"kubernetes.io/projected/b98644a1-99ab-4fab-be81-f0624b8a4c92-kube-api-access-s7jb9\") pod \"console-5d6789fbd5-8428x\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") " pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.397712 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.397683 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-869tz"
Apr 17 14:24:09.400550 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.400532 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w9wr2\""
Apr 17 14:24:09.408082 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.408056 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-869tz"
Apr 17 14:24:09.499444 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.499413 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:09.525930 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.525903 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-869tz"]
Apr 17 14:24:09.528429 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:24:09.528393 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd37b1dac_43fd_47dd_9f14_18b1f81b8155.slice/crio-f1179b6ac7878d8ba0dad0f4d5a8faf34ab1788f573b5810928d894476f699b8 WatchSource:0}: Error finding container f1179b6ac7878d8ba0dad0f4d5a8faf34ab1788f573b5810928d894476f699b8: Status 404 returned error can't find the container with id f1179b6ac7878d8ba0dad0f4d5a8faf34ab1788f573b5810928d894476f699b8
Apr 17 14:24:09.620934 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.620848 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d6789fbd5-8428x"]
Apr 17 14:24:09.623753 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:24:09.623724 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb98644a1_99ab_4fab_be81_f0624b8a4c92.slice/crio-e3f301b030bafd1ed56285295a5deb5781a7343a4dad224c8eda9e2186a2fbaa WatchSource:0}: Error finding container e3f301b030bafd1ed56285295a5deb5781a7343a4dad224c8eda9e2186a2fbaa: Status 404 returned error can't find the container with id e3f301b030bafd1ed56285295a5deb5781a7343a4dad224c8eda9e2186a2fbaa
Apr 17 14:24:09.938316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.938229 2575 patch_prober.go:28] interesting pod/image-registry-7b887f8f96-9xz6f container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 14:24:09.938316 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:09.938285 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" podUID="61a41f4f-43f3-483e-bf64-66b7a0d1f2a2" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 14:24:10.039487 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:10.039452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-869tz" event={"ID":"d37b1dac-43fd-47dd-9f14-18b1f81b8155","Type":"ContainerStarted","Data":"f1179b6ac7878d8ba0dad0f4d5a8faf34ab1788f573b5810928d894476f699b8"}
Apr 17 14:24:10.040512 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:10.040482 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d6789fbd5-8428x" event={"ID":"b98644a1-99ab-4fab-be81-f0624b8a4c92","Type":"ContainerStarted","Data":"e3f301b030bafd1ed56285295a5deb5781a7343a4dad224c8eda9e2186a2fbaa"}
Apr 17 14:24:10.397977 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:10.397943 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf"
Apr 17 14:24:12.049839 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:12.049796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-869tz" event={"ID":"d37b1dac-43fd-47dd-9f14-18b1f81b8155","Type":"ContainerStarted","Data":"b9c2d8667967843cfd9d588135c7de8a1bcd78cf971cc8c2fbf706e9066b3d2c"}
Apr 17 14:24:12.065905 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:12.065839 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-869tz" podStartSLOduration=136.173800207 podStartE2EDuration="2m18.065822519s" podCreationTimestamp="2026-04-17 14:21:54 +0000 UTC" firstStartedPulling="2026-04-17 14:24:09.530340749 +0000 UTC m=+167.837094347" lastFinishedPulling="2026-04-17 14:24:11.422363045 +0000 UTC m=+169.729116659" observedRunningTime="2026-04-17 14:24:12.064184631 +0000 UTC m=+170.370938276" watchObservedRunningTime="2026-04-17 14:24:12.065822519 +0000 UTC m=+170.372576142"
Apr 17 14:24:13.054183 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:13.054083 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d6789fbd5-8428x" event={"ID":"b98644a1-99ab-4fab-be81-f0624b8a4c92","Type":"ContainerStarted","Data":"2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015"}
Apr 17 14:24:13.071847 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:13.071800 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d6789fbd5-8428x" podStartSLOduration=1.083916874 podStartE2EDuration="4.071783796s" podCreationTimestamp="2026-04-17 14:24:09 +0000 UTC" firstStartedPulling="2026-04-17 14:24:09.625653288 +0000 UTC m=+167.932406889" lastFinishedPulling="2026-04-17 14:24:12.61352021 +0000 UTC m=+170.920273811" observedRunningTime="2026-04-17 14:24:13.069945487 +0000 UTC m=+171.376699104" watchObservedRunningTime="2026-04-17 14:24:13.071783796 +0000 UTC m=+171.378537416"
Apr 17 14:24:14.025648 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:14.025619 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lp8nz"
Apr 17 14:24:15.046498 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:15.046471 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7f8cfbb8c7-ptxbt"
Apr 17 14:24:18.535671 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.535633 2575 patch_prober.go:28] interesting pod/image-registry-7b887f8f96-9xz6f container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 14:24:18.536051 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.535687 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f" podUID="61a41f4f-43f3-483e-bf64-66b7a0d1f2a2" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 14:24:18.603392 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.603357 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bb78d9b4d-ccnzw"]
Apr 17 14:24:18.607117 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.607099 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.616019 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.615992 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 14:24:18.617134 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.617112 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bb78d9b4d-ccnzw"]
Apr 17 14:24:18.653812 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.653769 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr24l\" (UniqueName: \"kubernetes.io/projected/963c2c4f-a170-4ee7-804c-bd319f563375-kube-api-access-qr24l\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.653812 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.653805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-console-config\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.654088 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.653842 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-oauth-config\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.654088 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.653918 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-oauth-serving-cert\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.654088 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.653953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-serving-cert\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.654088 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.653993 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-trusted-ca-bundle\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.654088 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.654050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-service-ca\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.755430 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.755389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qr24l\" (UniqueName: \"kubernetes.io/projected/963c2c4f-a170-4ee7-804c-bd319f563375-kube-api-access-qr24l\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.755430 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.755436 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-console-config\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.755693 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.755463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-oauth-config\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.755693 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.755492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-oauth-serving-cert\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.755693 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.755520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-serving-cert\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.755693 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.755560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-trusted-ca-bundle\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.755693 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.755607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-service-ca\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.756299 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.756269 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-console-config\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.756299 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.756289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-service-ca\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.756452 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.756311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-oauth-serving-cert\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.756505 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.756449 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-trusted-ca-bundle\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.758008 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.757981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-oauth-config\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.758161 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.758140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-serving-cert\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.767672 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.767648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr24l\" (UniqueName: \"kubernetes.io/projected/963c2c4f-a170-4ee7-804c-bd319f563375-kube-api-access-qr24l\") pod \"console-6bb78d9b4d-ccnzw\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") " pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:18.916907 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:18.916848 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:19.051178 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:19.051091 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bb78d9b4d-ccnzw"]
Apr 17 14:24:19.053341 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:24:19.053304 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod963c2c4f_a170_4ee7_804c_bd319f563375.slice/crio-5d437782890407d90615c5e530278f1d9fb02454a2b9e7ad89cda831c9ee08a9 WatchSource:0}: Error finding container 5d437782890407d90615c5e530278f1d9fb02454a2b9e7ad89cda831c9ee08a9: Status 404 returned error can't find the container with id 5d437782890407d90615c5e530278f1d9fb02454a2b9e7ad89cda831c9ee08a9
Apr 17 14:24:19.073722 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:19.073695 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb78d9b4d-ccnzw" event={"ID":"963c2c4f-a170-4ee7-804c-bd319f563375","Type":"ContainerStarted","Data":"5d437782890407d90615c5e530278f1d9fb02454a2b9e7ad89cda831c9ee08a9"}
Apr 17 14:24:19.500305 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:19.500260 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:19.500458 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:19.500318 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:19.505144 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:19.505122 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:19.938072 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:19.938033 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7b887f8f96-9xz6f"
Apr 17 14:24:20.077732 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:20.077695 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb78d9b4d-ccnzw" event={"ID":"963c2c4f-a170-4ee7-804c-bd319f563375","Type":"ContainerStarted","Data":"975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc"}
Apr 17 14:24:20.082887 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:20.082846 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:20.095729 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:20.095688 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bb78d9b4d-ccnzw" podStartSLOduration=2.095676119 podStartE2EDuration="2.095676119s" podCreationTimestamp="2026-04-17 14:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:24:20.09450835 +0000 UTC m=+178.401261969" watchObservedRunningTime="2026-04-17 14:24:20.095676119 +0000 UTC m=+178.402429738"
Apr 17 14:24:26.178356 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:26.178319 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9"
Apr 17 14:24:26.178763 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:26.178426 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9"
Apr 17 14:24:28.917322 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:28.917285 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:28.917322 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:28.917331 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:28.922514 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:28.922487 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:29.105294 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:29.105267 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:24:29.149275 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:29.149238 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d6789fbd5-8428x"]
Apr 17 14:24:42.139189 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:42.139155 2575 generic.go:358] "Generic (PLEG): container finished" podID="c5d9d12d-4816-4a8c-954c-b83681df2cd9" containerID="ef3e0f9ad9968bf21416fdcffcc0fdb34206f79240548344958edfafa3f18b8e" exitCode=0
Apr 17 14:24:42.139602 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:42.139223 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b4t8r" event={"ID":"c5d9d12d-4816-4a8c-954c-b83681df2cd9","Type":"ContainerDied","Data":"ef3e0f9ad9968bf21416fdcffcc0fdb34206f79240548344958edfafa3f18b8e"}
Apr 17 14:24:42.139659 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:42.139614 2575 scope.go:117] "RemoveContainer" containerID="ef3e0f9ad9968bf21416fdcffcc0fdb34206f79240548344958edfafa3f18b8e"
Apr 17 14:24:42.140495 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:42.140473 2575 generic.go:358] "Generic (PLEG): container finished" podID="b3196128-fba4-41e4-a197-d8c5cb0025cb" containerID="31a27b40727544efea83907cc25720f64825efdac95e49c89ac9de1eb76f6ffe" exitCode=0
Apr 17 14:24:42.140586 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:42.140542 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq" event={"ID":"b3196128-fba4-41e4-a197-d8c5cb0025cb","Type":"ContainerDied","Data":"31a27b40727544efea83907cc25720f64825efdac95e49c89ac9de1eb76f6ffe"}
Apr 17 14:24:42.140813 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:42.140797 2575 scope.go:117] "RemoveContainer" containerID="31a27b40727544efea83907cc25720f64825efdac95e49c89ac9de1eb76f6ffe"
Apr 17 14:24:43.146972 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:43.146942 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b4t8r" event={"ID":"c5d9d12d-4816-4a8c-954c-b83681df2cd9","Type":"ContainerStarted","Data":"ff90276f4a9175261f2b82eddb27b04cea0c1f7cd774998a676118e454502c16"}
Apr 17 14:24:43.149153 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:43.149125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rkftq" event={"ID":"b3196128-fba4-41e4-a197-d8c5cb0025cb","Type":"ContainerStarted","Data":"86ce52834afe511307938d7eb157cc5863f2758c3610ee8728c952b265f1e9a2"}
Apr 17 14:24:43.177816 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:43.177790 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lp8nz_822a8df9-29ea-4649-a163-22e1db926c84/dns/0.log"
Apr 17 14:24:43.195852 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:43.195824 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lp8nz_822a8df9-29ea-4649-a163-22e1db926c84/kube-rbac-proxy/0.log"
Apr 17 14:24:43.733616 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:43.733584 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kvtp7_e8508015-adfb-42aa-acfc-92b24ec90241/dns-node-resolver/0.log"
Apr 17 14:24:46.183488 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:46.183442 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9"
Apr 17 14:24:46.187263 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:46.187238 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-558d8d6877-tvdz9"
Apr 17 14:24:54.168781 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.168717 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d6789fbd5-8428x" podUID="b98644a1-99ab-4fab-be81-f0624b8a4c92" containerName="console" containerID="cri-o://2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015" gracePeriod=15
Apr 17 14:24:54.425164 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.425096 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d6789fbd5-8428x_b98644a1-99ab-4fab-be81-f0624b8a4c92/console/0.log"
Apr 17 14:24:54.425287 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.425178 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d6789fbd5-8428x"
Apr 17 14:24:54.590670 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.590634 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-service-ca\") pod \"b98644a1-99ab-4fab-be81-f0624b8a4c92\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") "
Apr 17 14:24:54.590845 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.590689 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-oauth-config\") pod \"b98644a1-99ab-4fab-be81-f0624b8a4c92\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") "
Apr 17 14:24:54.590845 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.590767 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-serving-cert\") pod \"b98644a1-99ab-4fab-be81-f0624b8a4c92\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") "
Apr 17 14:24:54.590845 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.590838 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-oauth-serving-cert\") pod \"b98644a1-99ab-4fab-be81-f0624b8a4c92\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") "
Apr 17 14:24:54.591025 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.590903 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-config\") pod \"b98644a1-99ab-4fab-be81-f0624b8a4c92\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") "
Apr 17 14:24:54.591025 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.590931 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7jb9\" (UniqueName: \"kubernetes.io/projected/b98644a1-99ab-4fab-be81-f0624b8a4c92-kube-api-access-s7jb9\") pod \"b98644a1-99ab-4fab-be81-f0624b8a4c92\" (UID: \"b98644a1-99ab-4fab-be81-f0624b8a4c92\") "
Apr 17 14:24:54.591124 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.591034 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-service-ca" (OuterVolumeSpecName: "service-ca") pod "b98644a1-99ab-4fab-be81-f0624b8a4c92" (UID: "b98644a1-99ab-4fab-be81-f0624b8a4c92"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:24:54.591282 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.591228 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b98644a1-99ab-4fab-be81-f0624b8a4c92" (UID: "b98644a1-99ab-4fab-be81-f0624b8a4c92"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:24:54.591282 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.591238 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-config" (OuterVolumeSpecName: "console-config") pod "b98644a1-99ab-4fab-be81-f0624b8a4c92" (UID: "b98644a1-99ab-4fab-be81-f0624b8a4c92"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:24:54.591282 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.591248 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-service-ca\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:24:54.593321 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.593290 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b98644a1-99ab-4fab-be81-f0624b8a4c92" (UID: "b98644a1-99ab-4fab-be81-f0624b8a4c92"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:24:54.593451 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.593430 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b98644a1-99ab-4fab-be81-f0624b8a4c92-kube-api-access-s7jb9" (OuterVolumeSpecName: "kube-api-access-s7jb9") pod "b98644a1-99ab-4fab-be81-f0624b8a4c92" (UID: "b98644a1-99ab-4fab-be81-f0624b8a4c92"). InnerVolumeSpecName "kube-api-access-s7jb9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:24:54.593518 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.593434 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b98644a1-99ab-4fab-be81-f0624b8a4c92" (UID: "b98644a1-99ab-4fab-be81-f0624b8a4c92"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:24:54.692233 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.692147 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-serving-cert\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:24:54.692233 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.692179 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-oauth-serving-cert\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:24:54.692233 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.692189 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-config\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:24:54.692233 ip-10-0-138-3
kubenswrapper[2575]: I0417 14:24:54.692198 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s7jb9\" (UniqueName: \"kubernetes.io/projected/b98644a1-99ab-4fab-be81-f0624b8a4c92-kube-api-access-s7jb9\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:24:54.692233 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:54.692207 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b98644a1-99ab-4fab-be81-f0624b8a4c92-console-oauth-config\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:24:55.187899 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:55.185518 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d6789fbd5-8428x_b98644a1-99ab-4fab-be81-f0624b8a4c92/console/0.log" Apr 17 14:24:55.187899 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:55.185565 2575 generic.go:358] "Generic (PLEG): container finished" podID="b98644a1-99ab-4fab-be81-f0624b8a4c92" containerID="2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015" exitCode=2 Apr 17 14:24:55.187899 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:55.185616 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d6789fbd5-8428x" event={"ID":"b98644a1-99ab-4fab-be81-f0624b8a4c92","Type":"ContainerDied","Data":"2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015"} Apr 17 14:24:55.187899 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:55.185644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d6789fbd5-8428x" event={"ID":"b98644a1-99ab-4fab-be81-f0624b8a4c92","Type":"ContainerDied","Data":"e3f301b030bafd1ed56285295a5deb5781a7343a4dad224c8eda9e2186a2fbaa"} Apr 17 14:24:55.187899 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:55.185666 2575 scope.go:117] "RemoveContainer" containerID="2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015" Apr 17 14:24:55.187899 
ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:55.185826 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d6789fbd5-8428x" Apr 17 14:24:55.197541 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:55.197519 2575 scope.go:117] "RemoveContainer" containerID="2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015" Apr 17 14:24:55.197818 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:24:55.197797 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015\": container with ID starting with 2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015 not found: ID does not exist" containerID="2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015" Apr 17 14:24:55.197905 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:55.197827 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015"} err="failed to get container status \"2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015\": rpc error: code = NotFound desc = could not find container \"2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015\": container with ID starting with 2a96de5a69fa43bf00bb9a8df879e7ea2e0ba341ad74fb60319b9577cc4e9015 not found: ID does not exist" Apr 17 14:24:55.208393 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:55.208369 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d6789fbd5-8428x"] Apr 17 14:24:55.211944 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:55.211921 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d6789fbd5-8428x"] Apr 17 14:24:56.404290 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:24:56.404244 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b98644a1-99ab-4fab-be81-f0624b8a4c92" path="/var/lib/kubelet/pods/b98644a1-99ab-4fab-be81-f0624b8a4c92/volumes" Apr 17 14:25:21.628850 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:21.628814 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:25:21.629361 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:21.629216 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="alertmanager" containerID="cri-o://1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03" gracePeriod=120 Apr 17 14:25:21.629361 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:21.629291 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy-metric" containerID="cri-o://e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2" gracePeriod=120 Apr 17 14:25:21.629361 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:21.629318 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy-web" containerID="cri-o://9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55" gracePeriod=120 Apr 17 14:25:21.629535 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:21.629339 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="config-reloader" containerID="cri-o://6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda" gracePeriod=120 Apr 17 14:25:21.629535 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:21.629378 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="prom-label-proxy" containerID="cri-o://ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769" gracePeriod=120 Apr 17 14:25:21.629535 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:21.629423 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy" containerID="cri-o://54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73" gracePeriod=120 Apr 17 14:25:22.272077 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.272042 2575 generic.go:358] "Generic (PLEG): container finished" podID="e49357e5-7634-4418-8ccc-f492994acdaa" containerID="ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769" exitCode=0 Apr 17 14:25:22.272077 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.272073 2575 generic.go:358] "Generic (PLEG): container finished" podID="e49357e5-7634-4418-8ccc-f492994acdaa" containerID="54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73" exitCode=0 Apr 17 14:25:22.272077 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.272083 2575 generic.go:358] "Generic (PLEG): container finished" podID="e49357e5-7634-4418-8ccc-f492994acdaa" containerID="6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda" exitCode=0 Apr 17 14:25:22.272077 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.272092 2575 generic.go:358] "Generic (PLEG): container finished" podID="e49357e5-7634-4418-8ccc-f492994acdaa" containerID="1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03" exitCode=0 Apr 17 14:25:22.272382 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.272116 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerDied","Data":"ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769"} Apr 17 14:25:22.272382 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.272158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerDied","Data":"54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73"} Apr 17 14:25:22.272382 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.272168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerDied","Data":"6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda"} Apr 17 14:25:22.272382 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.272178 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerDied","Data":"1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03"} Apr 17 14:25:22.868199 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.868173 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:22.944127 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944092 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvws9\" (UniqueName: \"kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-kube-api-access-hvws9\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.944288 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944145 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.944288 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944177 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-cluster-tls-config\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.944288 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944203 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-web\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.944288 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944227 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy\") pod 
\"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.944288 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944250 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-trusted-ca-bundle\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.944288 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944270 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-main-tls\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.944608 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944297 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-tls-assets\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.944608 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944321 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-config-out\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.944608 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944354 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-config-volume\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 
14:25:22.944608 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944388 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-metrics-client-ca\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.944608 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944433 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-web-config\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.944608 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.944475 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-main-db\") pod \"e49357e5-7634-4418-8ccc-f492994acdaa\" (UID: \"e49357e5-7634-4418-8ccc-f492994acdaa\") " Apr 17 14:25:22.945349 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.945056 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:25:22.945349 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.945083 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). 
InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:25:22.945526 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.945435 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:25:22.947283 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.947253 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-kube-api-access-hvws9" (OuterVolumeSpecName: "kube-api-access-hvws9") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). InnerVolumeSpecName "kube-api-access-hvws9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:25:22.947678 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.947556 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:25:22.947678 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.947629 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:25:22.947897 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.947834 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:25:22.948006 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.947922 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:25:22.948006 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.947960 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-config-volume" (OuterVolumeSpecName: "config-volume") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:25:22.949033 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.949007 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:25:22.949378 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.949358 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-config-out" (OuterVolumeSpecName: "config-out") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:25:22.951667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.951641 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:25:22.957480 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:22.957419 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-web-config" (OuterVolumeSpecName: "web-config") pod "e49357e5-7634-4418-8ccc-f492994acdaa" (UID: "e49357e5-7634-4418-8ccc-f492994acdaa"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:25:23.045469 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045433 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-main-db\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045469 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045464 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hvws9\" (UniqueName: \"kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-kube-api-access-hvws9\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045469 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045475 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045655 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045485 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-cluster-tls-config\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045655 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045495 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045655 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045504 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045655 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045513 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045655 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045522 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-secret-alertmanager-main-tls\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045655 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045531 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e49357e5-7634-4418-8ccc-f492994acdaa-tls-assets\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045655 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045539 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e49357e5-7634-4418-8ccc-f492994acdaa-config-out\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045655 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045547 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-config-volume\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045655 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045557 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e49357e5-7634-4418-8ccc-f492994acdaa-metrics-client-ca\") on node 
\"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.045655 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.045565 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e49357e5-7634-4418-8ccc-f492994acdaa-web-config\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:25:23.277710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.277626 2575 generic.go:358] "Generic (PLEG): container finished" podID="e49357e5-7634-4418-8ccc-f492994acdaa" containerID="e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2" exitCode=0 Apr 17 14:25:23.277710 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.277658 2575 generic.go:358] "Generic (PLEG): container finished" podID="e49357e5-7634-4418-8ccc-f492994acdaa" containerID="9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55" exitCode=0 Apr 17 14:25:23.277910 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.277714 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerDied","Data":"e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2"} Apr 17 14:25:23.277910 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.277750 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.277910 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.277763 2575 scope.go:117] "RemoveContainer" containerID="ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769" Apr 17 14:25:23.277910 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.277752 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerDied","Data":"9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55"} Apr 17 14:25:23.278098 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.277916 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e49357e5-7634-4418-8ccc-f492994acdaa","Type":"ContainerDied","Data":"6106b8332278d6716325ba4cae09e16900123f41fe4f9c396d8e74abf9c05ccf"} Apr 17 14:25:23.284755 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.284729 2575 scope.go:117] "RemoveContainer" containerID="e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2" Apr 17 14:25:23.291384 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.291369 2575 scope.go:117] "RemoveContainer" containerID="54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73" Apr 17 14:25:23.297397 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.297381 2575 scope.go:117] "RemoveContainer" containerID="9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55" Apr 17 14:25:23.300731 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.300711 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:25:23.304452 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.304434 2575 scope.go:117] "RemoveContainer" containerID="6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda" Apr 17 14:25:23.305397 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.305381 2575 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:25:23.310602 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.310588 2575 scope.go:117] "RemoveContainer" containerID="1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03" Apr 17 14:25:23.316618 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.316603 2575 scope.go:117] "RemoveContainer" containerID="fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4" Apr 17 14:25:23.322864 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.322846 2575 scope.go:117] "RemoveContainer" containerID="ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769" Apr 17 14:25:23.323124 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:25:23.323108 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769\": container with ID starting with ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769 not found: ID does not exist" containerID="ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769" Apr 17 14:25:23.323181 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.323131 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769"} err="failed to get container status \"ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769\": rpc error: code = NotFound desc = could not find container \"ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769\": container with ID starting with ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769 not found: ID does not exist" Apr 17 14:25:23.323181 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.323148 2575 scope.go:117] "RemoveContainer" containerID="e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2" Apr 17 14:25:23.323338 ip-10-0-138-3 
kubenswrapper[2575]: E0417 14:25:23.323324 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2\": container with ID starting with e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2 not found: ID does not exist" containerID="e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2" Apr 17 14:25:23.323381 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.323342 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2"} err="failed to get container status \"e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2\": rpc error: code = NotFound desc = could not find container \"e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2\": container with ID starting with e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2 not found: ID does not exist" Apr 17 14:25:23.323381 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.323355 2575 scope.go:117] "RemoveContainer" containerID="54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73" Apr 17 14:25:23.323557 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:25:23.323540 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73\": container with ID starting with 54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73 not found: ID does not exist" containerID="54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73" Apr 17 14:25:23.323591 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.323563 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73"} err="failed to get 
container status \"54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73\": rpc error: code = NotFound desc = could not find container \"54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73\": container with ID starting with 54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73 not found: ID does not exist" Apr 17 14:25:23.323591 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.323580 2575 scope.go:117] "RemoveContainer" containerID="9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55" Apr 17 14:25:23.323773 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:25:23.323756 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55\": container with ID starting with 9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55 not found: ID does not exist" containerID="9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55" Apr 17 14:25:23.323811 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.323777 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55"} err="failed to get container status \"9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55\": rpc error: code = NotFound desc = could not find container \"9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55\": container with ID starting with 9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55 not found: ID does not exist" Apr 17 14:25:23.323811 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.323789 2575 scope.go:117] "RemoveContainer" containerID="6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda" Apr 17 14:25:23.324012 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:25:23.323997 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda\": container with ID starting with 6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda not found: ID does not exist" containerID="6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda" Apr 17 14:25:23.324057 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.324015 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda"} err="failed to get container status \"6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda\": rpc error: code = NotFound desc = could not find container \"6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda\": container with ID starting with 6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda not found: ID does not exist" Apr 17 14:25:23.324057 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.324027 2575 scope.go:117] "RemoveContainer" containerID="1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03" Apr 17 14:25:23.324403 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:25:23.324386 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03\": container with ID starting with 1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03 not found: ID does not exist" containerID="1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03" Apr 17 14:25:23.324471 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.324411 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03"} err="failed to get container status \"1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03\": rpc error: code = NotFound desc = could 
not find container \"1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03\": container with ID starting with 1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03 not found: ID does not exist" Apr 17 14:25:23.324471 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.324431 2575 scope.go:117] "RemoveContainer" containerID="fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4" Apr 17 14:25:23.324625 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:25:23.324612 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4\": container with ID starting with fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4 not found: ID does not exist" containerID="fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4" Apr 17 14:25:23.324658 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.324628 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4"} err="failed to get container status \"fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4\": rpc error: code = NotFound desc = could not find container \"fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4\": container with ID starting with fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4 not found: ID does not exist" Apr 17 14:25:23.324658 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.324640 2575 scope.go:117] "RemoveContainer" containerID="ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769" Apr 17 14:25:23.324829 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.324813 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769"} err="failed to get container status 
\"ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769\": rpc error: code = NotFound desc = could not find container \"ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769\": container with ID starting with ef926ea4088da0138ffd9d1c93c4005f2be226aa47ab71c240737a1ed688c769 not found: ID does not exist" Apr 17 14:25:23.324962 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.324831 2575 scope.go:117] "RemoveContainer" containerID="e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2" Apr 17 14:25:23.325076 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.325059 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2"} err="failed to get container status \"e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2\": rpc error: code = NotFound desc = could not find container \"e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2\": container with ID starting with e0bc0daca6dca5174b6efb519905fdf80c9d516da6c74e5e5576480aa4ee00e2 not found: ID does not exist" Apr 17 14:25:23.325122 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.325077 2575 scope.go:117] "RemoveContainer" containerID="54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73" Apr 17 14:25:23.325309 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.325291 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73"} err="failed to get container status \"54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73\": rpc error: code = NotFound desc = could not find container \"54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73\": container with ID starting with 54998b5a1e42d4b3c2c2acd563105c329b09abc72fcbf89ccd0cda1c6a254d73 not found: ID does not exist" Apr 17 14:25:23.325309 ip-10-0-138-3 kubenswrapper[2575]: 
I0417 14:25:23.325308 2575 scope.go:117] "RemoveContainer" containerID="9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55" Apr 17 14:25:23.325661 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.325640 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55"} err="failed to get container status \"9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55\": rpc error: code = NotFound desc = could not find container \"9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55\": container with ID starting with 9f149de94210911e033f35147e1cd33526331e223042475d302ebd10d8329a55 not found: ID does not exist" Apr 17 14:25:23.325729 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.325661 2575 scope.go:117] "RemoveContainer" containerID="6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda" Apr 17 14:25:23.326050 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.326020 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda"} err="failed to get container status \"6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda\": rpc error: code = NotFound desc = could not find container \"6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda\": container with ID starting with 6fa1c6c448c140638c011b404eabb5fcb410da6007e9dd8e59d58538190c8dda not found: ID does not exist" Apr 17 14:25:23.326050 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.326049 2575 scope.go:117] "RemoveContainer" containerID="1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03" Apr 17 14:25:23.326279 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.326263 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03"} 
err="failed to get container status \"1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03\": rpc error: code = NotFound desc = could not find container \"1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03\": container with ID starting with 1cf2d10522596483794d066c9db6edf31ba728f273580cf91e7be01438c8df03 not found: ID does not exist" Apr 17 14:25:23.326337 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.326281 2575 scope.go:117] "RemoveContainer" containerID="fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4" Apr 17 14:25:23.326542 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.326520 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4"} err="failed to get container status \"fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4\": rpc error: code = NotFound desc = could not find container \"fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4\": container with ID starting with fbe02d03fb433871811e2a0abeaa9e4cd47e6592e307396e83be993661bd43b4 not found: ID does not exist" Apr 17 14:25:23.327247 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327231 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:25:23.327498 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327488 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="init-config-reloader" Apr 17 14:25:23.327532 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327501 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="init-config-reloader" Apr 17 14:25:23.327532 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327511 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy-metric" Apr 17 14:25:23.327532 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327516 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy-metric" Apr 17 14:25:23.327532 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327525 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b98644a1-99ab-4fab-be81-f0624b8a4c92" containerName="console" Apr 17 14:25:23.327532 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327530 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98644a1-99ab-4fab-be81-f0624b8a4c92" containerName="console" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327538 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy-web" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327543 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy-web" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327558 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="config-reloader" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327563 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="config-reloader" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327569 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327575 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327582 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="alertmanager" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327587 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="alertmanager" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327592 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="prom-label-proxy" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327597 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="prom-label-proxy" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327651 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="alertmanager" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327659 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="config-reloader" Apr 17 14:25:23.327667 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327667 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy-web" Apr 17 14:25:23.328081 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327673 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="prom-label-proxy" Apr 17 14:25:23.328081 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327680 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy" Apr 17 14:25:23.328081 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327685 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b98644a1-99ab-4fab-be81-f0624b8a4c92" containerName="console" Apr 17 14:25:23.328081 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.327691 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" containerName="kube-rbac-proxy-metric" Apr 17 14:25:23.332711 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.332694 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.335261 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.335243 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 14:25:23.335523 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.335503 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 14:25:23.335616 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.335556 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 14:25:23.335616 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.335509 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 14:25:23.335722 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.335646 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 14:25:23.335795 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.335780 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 14:25:23.335843 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.335810 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 14:25:23.335903 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.335890 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 14:25:23.335959 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.335920 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xkzrg\"" Apr 17 14:25:23.340721 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.340700 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 14:25:23.343685 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.343664 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:25:23.449681 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.449639 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/931dfe5d-2ef0-45e0-9977-5d431e964c6e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.449681 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.449687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.449984 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.449715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsqct\" (UniqueName: \"kubernetes.io/projected/931dfe5d-2ef0-45e0-9977-5d431e964c6e-kube-api-access-rsqct\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.449984 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.449742 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-config-volume\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.449984 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.449762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/931dfe5d-2ef0-45e0-9977-5d431e964c6e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.449984 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.449794 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.449984 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.449900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/931dfe5d-2ef0-45e0-9977-5d431e964c6e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.449984 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.449943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.450227 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.449989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.450227 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.450012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.450227 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.450087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-web-config\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.450227 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:25:23.450121 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/931dfe5d-2ef0-45e0-9977-5d431e964c6e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.450227 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.450171 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/931dfe5d-2ef0-45e0-9977-5d431e964c6e-config-out\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.550743 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.550653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-web-config\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.550743 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.550722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/931dfe5d-2ef0-45e0-9977-5d431e964c6e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.550954 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.550937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/931dfe5d-2ef0-45e0-9977-5d431e964c6e-config-out\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.550992 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.550978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/931dfe5d-2ef0-45e0-9977-5d431e964c6e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551121 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551199 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551149 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsqct\" (UniqueName: \"kubernetes.io/projected/931dfe5d-2ef0-45e0-9977-5d431e964c6e-kube-api-access-rsqct\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551199 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-config-volume\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551294 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/931dfe5d-2ef0-45e0-9977-5d431e964c6e-tls-assets\") pod \"alertmanager-main-0\" (UID: 
\"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551294 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551395 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/931dfe5d-2ef0-45e0-9977-5d431e964c6e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551395 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551395 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551564 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551675 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/931dfe5d-2ef0-45e0-9977-5d431e964c6e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551761 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/931dfe5d-2ef0-45e0-9977-5d431e964c6e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.551821 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.551777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/931dfe5d-2ef0-45e0-9977-5d431e964c6e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.553805 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.553773 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/931dfe5d-2ef0-45e0-9977-5d431e964c6e-config-out\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.554112 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.554075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.554224 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.554112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.554224 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.554213 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-web-config\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.554408 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.554388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.554459 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.554403 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/931dfe5d-2ef0-45e0-9977-5d431e964c6e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.554587 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.554569 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-config-volume\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.554620 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.554592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.555636 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.555618 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/931dfe5d-2ef0-45e0-9977-5d431e964c6e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.558739 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.558721 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsqct\" (UniqueName: \"kubernetes.io/projected/931dfe5d-2ef0-45e0-9977-5d431e964c6e-kube-api-access-rsqct\") pod \"alertmanager-main-0\" (UID: \"931dfe5d-2ef0-45e0-9977-5d431e964c6e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.642902 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.642856 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:25:23.765526 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:23.765499 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:25:23.767597 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:25:23.767569 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod931dfe5d_2ef0_45e0_9977_5d431e964c6e.slice/crio-3eaf43554e345d7542a6e53bacbd53a65aeb07d62d093fe97b42f8b26fb65cf3 WatchSource:0}: Error finding container 3eaf43554e345d7542a6e53bacbd53a65aeb07d62d093fe97b42f8b26fb65cf3: Status 404 returned error can't find the container with id 3eaf43554e345d7542a6e53bacbd53a65aeb07d62d093fe97b42f8b26fb65cf3 Apr 17 14:25:24.283218 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:24.283178 2575 generic.go:358] "Generic (PLEG): container finished" podID="931dfe5d-2ef0-45e0-9977-5d431e964c6e" containerID="bd5165c56be236b08aed1c61698e1b26837ca5a35a6077000c80d480f898aecf" exitCode=0 Apr 17 14:25:24.283600 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:24.283223 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"931dfe5d-2ef0-45e0-9977-5d431e964c6e","Type":"ContainerDied","Data":"bd5165c56be236b08aed1c61698e1b26837ca5a35a6077000c80d480f898aecf"} Apr 17 14:25:24.283600 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:24.283258 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"931dfe5d-2ef0-45e0-9977-5d431e964c6e","Type":"ContainerStarted","Data":"3eaf43554e345d7542a6e53bacbd53a65aeb07d62d093fe97b42f8b26fb65cf3"} Apr 17 14:25:24.403372 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:24.403341 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49357e5-7634-4418-8ccc-f492994acdaa" 
path="/var/lib/kubelet/pods/e49357e5-7634-4418-8ccc-f492994acdaa/volumes" Apr 17 14:25:25.292866 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.292832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"931dfe5d-2ef0-45e0-9977-5d431e964c6e","Type":"ContainerStarted","Data":"428cde5b761ec035ecf90c47103aba63f9d97b4af3b2efdacfbab27c29776441"} Apr 17 14:25:25.292866 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.292866 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"931dfe5d-2ef0-45e0-9977-5d431e964c6e","Type":"ContainerStarted","Data":"d4fa43e7d351eabb9e6cb37b3eb2b737766138994137dd8b46c9332719bc7bf8"} Apr 17 14:25:25.293380 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.292895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"931dfe5d-2ef0-45e0-9977-5d431e964c6e","Type":"ContainerStarted","Data":"943619e362fc193900a0dc0e4ddc1ba409eef75b6e994c53ac87586069a8fdf5"} Apr 17 14:25:25.293380 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.292905 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"931dfe5d-2ef0-45e0-9977-5d431e964c6e","Type":"ContainerStarted","Data":"0bdfdf510dac73706413d26fb156e705d46633b087cc2733f9af3d99ce144413"} Apr 17 14:25:25.293380 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.292913 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"931dfe5d-2ef0-45e0-9977-5d431e964c6e","Type":"ContainerStarted","Data":"9e66241fcd90c7323a88e01c379600d62ff5dcd73f3c7f00e7c5d37b4d664f59"} Apr 17 14:25:25.293380 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.292921 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"931dfe5d-2ef0-45e0-9977-5d431e964c6e","Type":"ContainerStarted","Data":"a62eaf62b4151ffd925574e549274affa97802a7b3aeb3d06d4d066d0ba426bb"} Apr 17 14:25:25.318280 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.318228 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.318211229 podStartE2EDuration="2.318211229s" podCreationTimestamp="2026-04-17 14:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:25:25.315914884 +0000 UTC m=+243.622668504" watchObservedRunningTime="2026-04-17 14:25:25.318211229 +0000 UTC m=+243.624964849" Apr 17 14:25:25.659697 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.659656 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-757dc89df9-s8xkt"] Apr 17 14:25:25.663150 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.663134 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.665845 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.665818 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 14:25:25.665983 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.665961 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 14:25:25.666048 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.666001 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 14:25:25.666125 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.666064 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-kh8jw\"" Apr 17 14:25:25.666125 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.666109 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 14:25:25.666398 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.666381 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 14:25:25.671098 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.671010 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 14:25:25.673885 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.673855 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-757dc89df9-s8xkt"] Apr 17 14:25:25.778329 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.778293 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-federate-client-tls\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.778329 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.778334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-metrics-client-ca\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.778536 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.778441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-secret-telemeter-client\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.778536 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.778505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-telemeter-client-tls\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.778536 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.778527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.778645 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.778552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.778645 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.778578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-serving-certs-ca-bundle\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.778645 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.778613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x855s\" (UniqueName: \"kubernetes.io/projected/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-kube-api-access-x855s\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.879580 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.879533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-serving-certs-ca-bundle\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: 
\"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.879580 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.879585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x855s\" (UniqueName: \"kubernetes.io/projected/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-kube-api-access-x855s\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.879790 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.879640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-federate-client-tls\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.879790 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.879670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-metrics-client-ca\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.879790 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.879731 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-secret-telemeter-client\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.879968 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.879801 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-telemeter-client-tls\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.879968 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.879828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.879968 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.879866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.880696 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.880610 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-serving-certs-ca-bundle\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.881166 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.881139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.881753 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.881726 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-metrics-client-ca\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.885247 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.883243 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-federate-client-tls\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.885247 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.883304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.885247 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.883557 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-telemeter-client-tls\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.888229 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.888204 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x855s\" (UniqueName: \"kubernetes.io/projected/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-kube-api-access-x855s\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.888311 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.888224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/63eb63dd-c3b9-4102-80f3-158a88f0d1f1-secret-telemeter-client\") pod \"telemeter-client-757dc89df9-s8xkt\" (UID: \"63eb63dd-c3b9-4102-80f3-158a88f0d1f1\") " pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:25.974111 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:25.974021 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" Apr 17 14:25:26.099847 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:26.099817 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-757dc89df9-s8xkt"] Apr 17 14:25:26.103049 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:25:26.103014 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63eb63dd_c3b9_4102_80f3_158a88f0d1f1.slice/crio-6e818dc28f0fa38adc85fdceef115016822307523b85d0375ba22f136db83501 WatchSource:0}: Error finding container 6e818dc28f0fa38adc85fdceef115016822307523b85d0375ba22f136db83501: Status 404 returned error can't find the container with id 6e818dc28f0fa38adc85fdceef115016822307523b85d0375ba22f136db83501 Apr 17 14:25:26.296509 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:26.296428 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" 
event={"ID":"63eb63dd-c3b9-4102-80f3-158a88f0d1f1","Type":"ContainerStarted","Data":"6e818dc28f0fa38adc85fdceef115016822307523b85d0375ba22f136db83501"} Apr 17 14:25:29.307129 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:29.307087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" event={"ID":"63eb63dd-c3b9-4102-80f3-158a88f0d1f1","Type":"ContainerStarted","Data":"ff54e89b061c4c04dcf2768bfd49ee6870c378aa6ec74331c8129f17cb9995f4"} Apr 17 14:25:29.307129 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:29.307125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" event={"ID":"63eb63dd-c3b9-4102-80f3-158a88f0d1f1","Type":"ContainerStarted","Data":"f76653587cb927fcfb1c8b418db3f853a1614667265e4128d5ff02d6753a8c61"} Apr 17 14:25:29.307129 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:29.307136 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" event={"ID":"63eb63dd-c3b9-4102-80f3-158a88f0d1f1","Type":"ContainerStarted","Data":"0a0b775c88d1971ed36d65abd8fafccdf932f49ea67eea33d3cf6077dfc917da"} Apr 17 14:25:29.329504 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:29.329458 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-757dc89df9-s8xkt" podStartSLOduration=2.088949353 podStartE2EDuration="4.329442984s" podCreationTimestamp="2026-04-17 14:25:25 +0000 UTC" firstStartedPulling="2026-04-17 14:25:26.104896734 +0000 UTC m=+244.411650332" lastFinishedPulling="2026-04-17 14:25:28.34539036 +0000 UTC m=+246.652143963" observedRunningTime="2026-04-17 14:25:29.328734572 +0000 UTC m=+247.635488177" watchObservedRunningTime="2026-04-17 14:25:29.329442984 +0000 UTC m=+247.636196606" Apr 17 14:25:30.330815 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.330784 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-578d4c8b9-kkzhw"] Apr 17 14:25:30.334424 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.334402 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-578d4c8b9-kkzhw" Apr 17 14:25:30.346312 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.346288 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-578d4c8b9-kkzhw"] Apr 17 14:25:30.423924 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.423888 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-service-ca\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw" Apr 17 14:25:30.424121 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.424098 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-serving-cert\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw" Apr 17 14:25:30.424214 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.424179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57mjl\" (UniqueName: \"kubernetes.io/projected/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-kube-api-access-57mjl\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw" Apr 17 14:25:30.424829 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.424766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-oauth-serving-cert\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw" Apr 17 14:25:30.424969 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.424860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-oauth-config\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw" Apr 17 14:25:30.424969 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.424911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-config\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw" Apr 17 14:25:30.425080 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.425009 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-trusted-ca-bundle\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw" Apr 17 14:25:30.526472 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.526414 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-oauth-config\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw" Apr 17 14:25:30.526472 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.526472 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-config\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.526725 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.526508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-trusted-ca-bundle\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.526725 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.526565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-service-ca\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.526725 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.526638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-serving-cert\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.526725 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.526667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57mjl\" (UniqueName: \"kubernetes.io/projected/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-kube-api-access-57mjl\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.526725 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.526695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-oauth-serving-cert\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.527382 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.527348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-config\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.527488 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.527429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-oauth-serving-cert\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.527527 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.527509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-trusted-ca-bundle\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.527564 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.527518 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-service-ca\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.528846 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.528819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-oauth-config\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.529028 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.529011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-serving-cert\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.534333 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.534310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57mjl\" (UniqueName: \"kubernetes.io/projected/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-kube-api-access-57mjl\") pod \"console-578d4c8b9-kkzhw\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.643726 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.643639 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:30.760517 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:30.760364 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-578d4c8b9-kkzhw"]
Apr 17 14:25:30.763991 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:25:30.763957 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cc9a4f3_c03a_474d_9baf_bfc6c160ecfc.slice/crio-3d938d3b4d4680d2f7ba8b5a4652ccc73f4e824daf37fbc212283437f7a3e307 WatchSource:0}: Error finding container 3d938d3b4d4680d2f7ba8b5a4652ccc73f4e824daf37fbc212283437f7a3e307: Status 404 returned error can't find the container with id 3d938d3b4d4680d2f7ba8b5a4652ccc73f4e824daf37fbc212283437f7a3e307
Apr 17 14:25:31.315354 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:31.315316 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578d4c8b9-kkzhw" event={"ID":"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc","Type":"ContainerStarted","Data":"ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7"}
Apr 17 14:25:31.315354 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:31.315355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578d4c8b9-kkzhw" event={"ID":"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc","Type":"ContainerStarted","Data":"3d938d3b4d4680d2f7ba8b5a4652ccc73f4e824daf37fbc212283437f7a3e307"}
Apr 17 14:25:31.333153 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:31.333109 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-578d4c8b9-kkzhw" podStartSLOduration=1.333097751 podStartE2EDuration="1.333097751s" podCreationTimestamp="2026-04-17 14:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:25:31.331533893 +0000 UTC m=+249.638287512" watchObservedRunningTime="2026-04-17 14:25:31.333097751 +0000 UTC m=+249.639851370"
Apr 17 14:25:33.153507 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:33.153471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf"
Apr 17 14:25:33.155814 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:33.155789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0eca2ae-83f6-462e-b7d9-9ab1592717a8-metrics-certs\") pod \"network-metrics-daemon-ccsgf\" (UID: \"d0eca2ae-83f6-462e-b7d9-9ab1592717a8\") " pod="openshift-multus/network-metrics-daemon-ccsgf"
Apr 17 14:25:33.202067 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:33.202035 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hxvwz\""
Apr 17 14:25:33.209627 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:33.209609 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ccsgf"
Apr 17 14:25:33.339536 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:33.339512 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ccsgf"]
Apr 17 14:25:33.341939 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:25:33.341914 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0eca2ae_83f6_462e_b7d9_9ab1592717a8.slice/crio-2a46c18910b12ae415b58d04a117ee6af9fee39062f0d9d74fb258e7c9377cb1 WatchSource:0}: Error finding container 2a46c18910b12ae415b58d04a117ee6af9fee39062f0d9d74fb258e7c9377cb1: Status 404 returned error can't find the container with id 2a46c18910b12ae415b58d04a117ee6af9fee39062f0d9d74fb258e7c9377cb1
Apr 17 14:25:34.326050 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:34.326017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ccsgf" event={"ID":"d0eca2ae-83f6-462e-b7d9-9ab1592717a8","Type":"ContainerStarted","Data":"2a46c18910b12ae415b58d04a117ee6af9fee39062f0d9d74fb258e7c9377cb1"}
Apr 17 14:25:35.330821 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:35.330780 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ccsgf" event={"ID":"d0eca2ae-83f6-462e-b7d9-9ab1592717a8","Type":"ContainerStarted","Data":"e6ed75953a80763a2a972c53fb9449f4eb11d09251f2ad50f3238247f6d9569f"}
Apr 17 14:25:35.331201 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:35.330826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ccsgf" event={"ID":"d0eca2ae-83f6-462e-b7d9-9ab1592717a8","Type":"ContainerStarted","Data":"3aca1feb157631ccff2ff86a3e809bd1abb0cc08a19ed3993242ac42010a0d1e"}
Apr 17 14:25:35.348228 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:35.348181 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ccsgf" podStartSLOduration=252.341118875 podStartE2EDuration="4m13.348161989s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="2026-04-17 14:25:33.343772667 +0000 UTC m=+251.650526264" lastFinishedPulling="2026-04-17 14:25:34.35081578 +0000 UTC m=+252.657569378" observedRunningTime="2026-04-17 14:25:35.345867005 +0000 UTC m=+253.652620649" watchObservedRunningTime="2026-04-17 14:25:35.348161989 +0000 UTC m=+253.654915611"
Apr 17 14:25:40.644781 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:40.644676 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:40.644781 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:40.644736 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:40.649461 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:40.649439 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:41.354113 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:41.354087 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-578d4c8b9-kkzhw"
Apr 17 14:25:41.399029 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:25:41.398994 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bb78d9b4d-ccnzw"]
Apr 17 14:26:06.419928 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.419853 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6bb78d9b4d-ccnzw" podUID="963c2c4f-a170-4ee7-804c-bd319f563375" containerName="console" containerID="cri-o://975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc" gracePeriod=15
Apr 17 14:26:06.664697 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.664671 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bb78d9b4d-ccnzw_963c2c4f-a170-4ee7-804c-bd319f563375/console/0.log"
Apr 17 14:26:06.664830 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.664744 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:26:06.743865 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.743777 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-console-config\") pod \"963c2c4f-a170-4ee7-804c-bd319f563375\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") "
Apr 17 14:26:06.743865 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.743815 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-serving-cert\") pod \"963c2c4f-a170-4ee7-804c-bd319f563375\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") "
Apr 17 14:26:06.743865 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.743845 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-trusted-ca-bundle\") pod \"963c2c4f-a170-4ee7-804c-bd319f563375\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") "
Apr 17 14:26:06.744165 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.743901 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-service-ca\") pod \"963c2c4f-a170-4ee7-804c-bd319f563375\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") "
Apr 17 14:26:06.744165 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.743947 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-oauth-config\") pod \"963c2c4f-a170-4ee7-804c-bd319f563375\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") "
Apr 17 14:26:06.744165 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.743966 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-oauth-serving-cert\") pod \"963c2c4f-a170-4ee7-804c-bd319f563375\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") "
Apr 17 14:26:06.744165 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.743999 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr24l\" (UniqueName: \"kubernetes.io/projected/963c2c4f-a170-4ee7-804c-bd319f563375-kube-api-access-qr24l\") pod \"963c2c4f-a170-4ee7-804c-bd319f563375\" (UID: \"963c2c4f-a170-4ee7-804c-bd319f563375\") "
Apr 17 14:26:06.744362 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.744280 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-console-config" (OuterVolumeSpecName: "console-config") pod "963c2c4f-a170-4ee7-804c-bd319f563375" (UID: "963c2c4f-a170-4ee7-804c-bd319f563375"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:26:06.744465 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.744437 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-service-ca" (OuterVolumeSpecName: "service-ca") pod "963c2c4f-a170-4ee7-804c-bd319f563375" (UID: "963c2c4f-a170-4ee7-804c-bd319f563375"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:26:06.744533 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.744504 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "963c2c4f-a170-4ee7-804c-bd319f563375" (UID: "963c2c4f-a170-4ee7-804c-bd319f563375"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:26:06.744585 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.744526 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "963c2c4f-a170-4ee7-804c-bd319f563375" (UID: "963c2c4f-a170-4ee7-804c-bd319f563375"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:26:06.746106 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.746087 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "963c2c4f-a170-4ee7-804c-bd319f563375" (UID: "963c2c4f-a170-4ee7-804c-bd319f563375"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:26:06.746502 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.746479 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "963c2c4f-a170-4ee7-804c-bd319f563375" (UID: "963c2c4f-a170-4ee7-804c-bd319f563375"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:26:06.746565 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.746509 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963c2c4f-a170-4ee7-804c-bd319f563375-kube-api-access-qr24l" (OuterVolumeSpecName: "kube-api-access-qr24l") pod "963c2c4f-a170-4ee7-804c-bd319f563375" (UID: "963c2c4f-a170-4ee7-804c-bd319f563375"). InnerVolumeSpecName "kube-api-access-qr24l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:26:06.845188 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.845149 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-oauth-config\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:26:06.845188 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.845183 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-oauth-serving-cert\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:26:06.845188 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.845193 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qr24l\" (UniqueName: \"kubernetes.io/projected/963c2c4f-a170-4ee7-804c-bd319f563375-kube-api-access-qr24l\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:26:06.845415 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.845203 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-console-config\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:26:06.845415 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.845212 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/963c2c4f-a170-4ee7-804c-bd319f563375-console-serving-cert\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:26:06.845415 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.845221 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-trusted-ca-bundle\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:26:06.845415 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:06.845230 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/963c2c4f-a170-4ee7-804c-bd319f563375-service-ca\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:26:07.426402 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:07.426369 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bb78d9b4d-ccnzw_963c2c4f-a170-4ee7-804c-bd319f563375/console/0.log"
Apr 17 14:26:07.426790 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:07.426411 2575 generic.go:358] "Generic (PLEG): container finished" podID="963c2c4f-a170-4ee7-804c-bd319f563375" containerID="975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc" exitCode=2
Apr 17 14:26:07.426790 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:07.426448 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb78d9b4d-ccnzw" event={"ID":"963c2c4f-a170-4ee7-804c-bd319f563375","Type":"ContainerDied","Data":"975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc"}
Apr 17 14:26:07.426790 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:07.426486 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bb78d9b4d-ccnzw"
Apr 17 14:26:07.426790 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:07.426491 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb78d9b4d-ccnzw" event={"ID":"963c2c4f-a170-4ee7-804c-bd319f563375","Type":"ContainerDied","Data":"5d437782890407d90615c5e530278f1d9fb02454a2b9e7ad89cda831c9ee08a9"}
Apr 17 14:26:07.426790 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:07.426512 2575 scope.go:117] "RemoveContainer" containerID="975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc"
Apr 17 14:26:07.434969 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:07.434953 2575 scope.go:117] "RemoveContainer" containerID="975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc"
Apr 17 14:26:07.435195 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:26:07.435178 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc\": container with ID starting with 975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc not found: ID does not exist" containerID="975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc"
Apr 17 14:26:07.435238 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:07.435202 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc"} err="failed to get container status \"975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc\": rpc error: code = NotFound desc = could not find container \"975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc\": container with ID starting with 975d889f86e944dbea587111f5d2d2e0cb0652dc035e32aa33b3e17d6ab1f9bc not found: ID does not exist"
Apr 17 14:26:07.446538 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:07.446513 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bb78d9b4d-ccnzw"]
Apr 17 14:26:07.449902 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:07.449861 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bb78d9b4d-ccnzw"]
Apr 17 14:26:08.401959 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:08.401923 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="963c2c4f-a170-4ee7-804c-bd319f563375" path="/var/lib/kubelet/pods/963c2c4f-a170-4ee7-804c-bd319f563375/volumes"
Apr 17 14:26:22.289119 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:26:22.289094 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 14:27:04.483984 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.483944 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64b6cfcbd7-kcp65"]
Apr 17 14:27:04.484385 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.484288 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="963c2c4f-a170-4ee7-804c-bd319f563375" containerName="console"
Apr 17 14:27:04.484385 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.484301 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="963c2c4f-a170-4ee7-804c-bd319f563375" containerName="console"
Apr 17 14:27:04.484385 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.484365 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="963c2c4f-a170-4ee7-804c-bd319f563375" containerName="console"
Apr 17 14:27:04.487531 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.487514 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.495402 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.495376 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b6cfcbd7-kcp65"]
Apr 17 14:27:04.543197 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.543160 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-serving-cert\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.543197 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.543201 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4hwr\" (UniqueName: \"kubernetes.io/projected/418f4446-5565-4732-aca2-3070fbe1d5c4-kube-api-access-l4hwr\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.543403 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.543271 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-console-config\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.543403 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.543322 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-oauth-serving-cert\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.543403 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.543346 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-oauth-config\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.543497 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.543421 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-trusted-ca-bundle\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.543497 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.543466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-service-ca\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.644674 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.644625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-serving-cert\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.644674 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.644675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4hwr\" (UniqueName: \"kubernetes.io/projected/418f4446-5565-4732-aca2-3070fbe1d5c4-kube-api-access-l4hwr\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.644997 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.644723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-console-config\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.644997 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.644783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-oauth-serving-cert\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.644997 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.644817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-oauth-config\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.644997 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.644844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-trusted-ca-bundle\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.644997 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.644865 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-service-ca\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.645957 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.645917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-oauth-serving-cert\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.646124 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.645988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-service-ca\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.646239 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.646037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-trusted-ca-bundle\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.646319 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.646118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-console-config\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.648188 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.648155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-oauth-config\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.648300 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.648258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-serving-cert\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.653266 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.653241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4hwr\" (UniqueName: \"kubernetes.io/projected/418f4446-5565-4732-aca2-3070fbe1d5c4-kube-api-access-l4hwr\") pod \"console-64b6cfcbd7-kcp65\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.797327 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.797231 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:04.914515 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.914489 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b6cfcbd7-kcp65"]
Apr 17 14:27:04.917213 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:27:04.917184 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod418f4446_5565_4732_aca2_3070fbe1d5c4.slice/crio-791ef1e11d333d4bd907837cc4e6903aedcb5cd28c510c15a68774e14337dade WatchSource:0}: Error finding container 791ef1e11d333d4bd907837cc4e6903aedcb5cd28c510c15a68774e14337dade: Status 404 returned error can't find the container with id 791ef1e11d333d4bd907837cc4e6903aedcb5cd28c510c15a68774e14337dade
Apr 17 14:27:04.919175 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:04.919157 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 14:27:05.596824 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:05.596786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b6cfcbd7-kcp65" event={"ID":"418f4446-5565-4732-aca2-3070fbe1d5c4","Type":"ContainerStarted","Data":"9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82"}
Apr 17 14:27:05.596824 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:05.596824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b6cfcbd7-kcp65" event={"ID":"418f4446-5565-4732-aca2-3070fbe1d5c4","Type":"ContainerStarted","Data":"791ef1e11d333d4bd907837cc4e6903aedcb5cd28c510c15a68774e14337dade"}
Apr 17 14:27:05.618098 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:05.618042 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64b6cfcbd7-kcp65" podStartSLOduration=1.618025424 podStartE2EDuration="1.618025424s" podCreationTimestamp="2026-04-17 14:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:27:05.61673269 +0000 UTC m=+343.923486303" watchObservedRunningTime="2026-04-17 14:27:05.618025424 +0000 UTC m=+343.924779042"
Apr 17 14:27:14.797971 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:14.797850 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:14.797971 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:14.797919 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:14.802472 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:14.802449 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:15.628945 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:15.628917 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64b6cfcbd7-kcp65"
Apr 17 14:27:15.673391 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:15.673357 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-578d4c8b9-kkzhw"]
Apr 17 14:27:40.693439 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:40.693380 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-578d4c8b9-kkzhw" podUID="0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" containerName="console" containerID="cri-o://ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7" gracePeriod=15
Apr 17 14:27:40.930008 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:40.929987 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-578d4c8b9-kkzhw_0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc/console/0.log"
Apr 17 14:27:40.930130 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:40.930045 2575 util.go:48] "No ready sandbox for
pod can be found. Need to start a new one" pod="openshift-console/console-578d4c8b9-kkzhw" Apr 17 14:27:41.076159 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.076129 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-service-ca\") pod \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " Apr 17 14:27:41.076344 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.076172 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-serving-cert\") pod \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " Apr 17 14:27:41.076344 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.076211 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-trusted-ca-bundle\") pod \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " Apr 17 14:27:41.076344 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.076247 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-config\") pod \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " Apr 17 14:27:41.076344 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.076285 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57mjl\" (UniqueName: \"kubernetes.io/projected/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-kube-api-access-57mjl\") pod \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " Apr 17 14:27:41.076344 
ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.076323 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-oauth-config\") pod \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " Apr 17 14:27:41.076581 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.076374 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-oauth-serving-cert\") pod \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\" (UID: \"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc\") " Apr 17 14:27:41.076581 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.076566 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-service-ca" (OuterVolumeSpecName: "service-ca") pod "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" (UID: "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:27:41.076694 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.076661 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" (UID: "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:27:41.076790 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.076667 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-config" (OuterVolumeSpecName: "console-config") pod "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" (UID: "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:27:41.076951 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.076927 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" (UID: "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:27:41.078595 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.078561 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" (UID: "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:27:41.078682 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.078585 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" (UID: "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:27:41.078682 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.078599 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-kube-api-access-57mjl" (OuterVolumeSpecName: "kube-api-access-57mjl") pod "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" (UID: "0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc"). InnerVolumeSpecName "kube-api-access-57mjl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:27:41.177707 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.177671 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-trusted-ca-bundle\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:27:41.177707 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.177699 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-config\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:27:41.177707 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.177710 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57mjl\" (UniqueName: \"kubernetes.io/projected/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-kube-api-access-57mjl\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:27:41.177707 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.177719 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-oauth-config\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:27:41.177998 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.177729 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-oauth-serving-cert\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:27:41.177998 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.177738 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-service-ca\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:27:41.177998 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.177746 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc-console-serving-cert\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:27:41.700700 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.700674 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-578d4c8b9-kkzhw_0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc/console/0.log" Apr 17 14:27:41.701151 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.700726 2575 generic.go:358] "Generic (PLEG): container finished" podID="0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" containerID="ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7" exitCode=2 Apr 17 14:27:41.701151 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.700819 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-578d4c8b9-kkzhw" Apr 17 14:27:41.701151 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.700816 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578d4c8b9-kkzhw" event={"ID":"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc","Type":"ContainerDied","Data":"ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7"} Apr 17 14:27:41.701151 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.700920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578d4c8b9-kkzhw" event={"ID":"0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc","Type":"ContainerDied","Data":"3d938d3b4d4680d2f7ba8b5a4652ccc73f4e824daf37fbc212283437f7a3e307"} Apr 17 14:27:41.701151 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.700944 2575 scope.go:117] "RemoveContainer" containerID="ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7" Apr 17 14:27:41.709246 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.709231 2575 scope.go:117] "RemoveContainer" containerID="ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7" Apr 17 14:27:41.709488 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:27:41.709468 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7\": container with ID starting with ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7 not found: ID does not exist" containerID="ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7" Apr 17 14:27:41.709537 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.709496 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7"} err="failed to get container status \"ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7\": rpc error: code = NotFound desc = 
could not find container \"ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7\": container with ID starting with ee3b4a7a72e07d8d5ad64b361ff545c5cfe2a554d59625e620274fa838cceba7 not found: ID does not exist" Apr 17 14:27:41.721152 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.721128 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-578d4c8b9-kkzhw"] Apr 17 14:27:41.725356 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:41.725324 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-578d4c8b9-kkzhw"] Apr 17 14:27:42.402767 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:27:42.402727 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" path="/var/lib/kubelet/pods/0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc/volumes" Apr 17 14:28:04.576452 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.576416 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx"] Apr 17 14:28:04.576893 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.576781 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" containerName="console" Apr 17 14:28:04.576893 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.576793 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" containerName="console" Apr 17 14:28:04.576893 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.576847 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cc9a4f3-c03a-474d-9baf-bfc6c160ecfc" containerName="console" Apr 17 14:28:04.581153 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.581136 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx" Apr 17 14:28:04.583591 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.583568 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:28:04.583810 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.583794 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-8mwj5\"" Apr 17 14:28:04.583891 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.583815 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 14:28:04.590736 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.590713 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx"] Apr 17 14:28:04.675429 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.675387 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19b16ae0-25e2-42e8-b720-e11645a965f5-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-rkbxx\" (UID: \"19b16ae0-25e2-42e8-b720-e11645a965f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx" Apr 17 14:28:04.675429 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.675429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckzsk\" (UniqueName: \"kubernetes.io/projected/19b16ae0-25e2-42e8-b720-e11645a965f5-kube-api-access-ckzsk\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-rkbxx\" (UID: \"19b16ae0-25e2-42e8-b720-e11645a965f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx" Apr 17 
14:28:04.776839 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.776807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19b16ae0-25e2-42e8-b720-e11645a965f5-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-rkbxx\" (UID: \"19b16ae0-25e2-42e8-b720-e11645a965f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx" Apr 17 14:28:04.776839 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.776845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckzsk\" (UniqueName: \"kubernetes.io/projected/19b16ae0-25e2-42e8-b720-e11645a965f5-kube-api-access-ckzsk\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-rkbxx\" (UID: \"19b16ae0-25e2-42e8-b720-e11645a965f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx" Apr 17 14:28:04.777188 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.777169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19b16ae0-25e2-42e8-b720-e11645a965f5-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-rkbxx\" (UID: \"19b16ae0-25e2-42e8-b720-e11645a965f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx" Apr 17 14:28:04.785326 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.785303 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckzsk\" (UniqueName: \"kubernetes.io/projected/19b16ae0-25e2-42e8-b720-e11645a965f5-kube-api-access-ckzsk\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-rkbxx\" (UID: \"19b16ae0-25e2-42e8-b720-e11645a965f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx" Apr 17 14:28:04.891269 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:04.891181 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx" Apr 17 14:28:05.016051 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:05.016024 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx"] Apr 17 14:28:05.018796 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:28:05.018768 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b16ae0_25e2_42e8_b720_e11645a965f5.slice/crio-fdc1511d4ee4c039e2ce9cc48df1cab326ce10d8f34017eb4d66fb2dd27e600a WatchSource:0}: Error finding container fdc1511d4ee4c039e2ce9cc48df1cab326ce10d8f34017eb4d66fb2dd27e600a: Status 404 returned error can't find the container with id fdc1511d4ee4c039e2ce9cc48df1cab326ce10d8f34017eb4d66fb2dd27e600a Apr 17 14:28:05.772049 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:05.772009 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx" event={"ID":"19b16ae0-25e2-42e8-b720-e11645a965f5","Type":"ContainerStarted","Data":"fdc1511d4ee4c039e2ce9cc48df1cab326ce10d8f34017eb4d66fb2dd27e600a"} Apr 17 14:28:10.791400 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:10.791360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx" event={"ID":"19b16ae0-25e2-42e8-b720-e11645a965f5","Type":"ContainerStarted","Data":"7bfe363bf0144ebc2209053d7ffd90a46c4d2cf0e182a7dba5d0e655f55e85b9"} Apr 17 14:28:10.812549 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:10.812495 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-rkbxx" podStartSLOduration=1.959765483 podStartE2EDuration="6.812478051s" podCreationTimestamp="2026-04-17 14:28:04 +0000 UTC" 
firstStartedPulling="2026-04-17 14:28:05.021199004 +0000 UTC m=+403.327952615" lastFinishedPulling="2026-04-17 14:28:09.873911584 +0000 UTC m=+408.180665183" observedRunningTime="2026-04-17 14:28:10.809368208 +0000 UTC m=+409.116121827" watchObservedRunningTime="2026-04-17 14:28:10.812478051 +0000 UTC m=+409.119231670" Apr 17 14:28:16.066896 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.066852 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-tvd6r"] Apr 17 14:28:16.069882 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.069857 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" Apr 17 14:28:16.072495 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.072464 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 14:28:16.072624 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.072523 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 14:28:16.073739 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.073726 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-dfv7x\"" Apr 17 14:28:16.077505 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.077486 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-tvd6r"] Apr 17 14:28:16.173529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.173496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7a956e0-933f-4307-b894-215ad8b570bf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-tvd6r\" (UID: \"c7a956e0-933f-4307-b894-215ad8b570bf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" Apr 17 
14:28:16.173529 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.173525 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wbg8\" (UniqueName: \"kubernetes.io/projected/c7a956e0-933f-4307-b894-215ad8b570bf-kube-api-access-6wbg8\") pod \"cert-manager-webhook-597b96b99b-tvd6r\" (UID: \"c7a956e0-933f-4307-b894-215ad8b570bf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" Apr 17 14:28:16.274011 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.273988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7a956e0-933f-4307-b894-215ad8b570bf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-tvd6r\" (UID: \"c7a956e0-933f-4307-b894-215ad8b570bf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" Apr 17 14:28:16.274011 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.274014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wbg8\" (UniqueName: \"kubernetes.io/projected/c7a956e0-933f-4307-b894-215ad8b570bf-kube-api-access-6wbg8\") pod \"cert-manager-webhook-597b96b99b-tvd6r\" (UID: \"c7a956e0-933f-4307-b894-215ad8b570bf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" Apr 17 14:28:16.281817 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.281793 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7a956e0-933f-4307-b894-215ad8b570bf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-tvd6r\" (UID: \"c7a956e0-933f-4307-b894-215ad8b570bf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" Apr 17 14:28:16.281986 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.281967 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wbg8\" (UniqueName: 
\"kubernetes.io/projected/c7a956e0-933f-4307-b894-215ad8b570bf-kube-api-access-6wbg8\") pod \"cert-manager-webhook-597b96b99b-tvd6r\" (UID: \"c7a956e0-933f-4307-b894-215ad8b570bf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" Apr 17 14:28:16.389180 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.389067 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" Apr 17 14:28:16.502095 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.502063 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-tvd6r"] Apr 17 14:28:16.505008 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:28:16.504983 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7a956e0_933f_4307_b894_215ad8b570bf.slice/crio-dbf589e26b98ac17a580f53ce6a014a6d26810de4d5660d937e42cf53bfbd8ab WatchSource:0}: Error finding container dbf589e26b98ac17a580f53ce6a014a6d26810de4d5660d937e42cf53bfbd8ab: Status 404 returned error can't find the container with id dbf589e26b98ac17a580f53ce6a014a6d26810de4d5660d937e42cf53bfbd8ab Apr 17 14:28:16.810343 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:16.810315 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" event={"ID":"c7a956e0-933f-4307-b894-215ad8b570bf","Type":"ContainerStarted","Data":"dbf589e26b98ac17a580f53ce6a014a6d26810de4d5660d937e42cf53bfbd8ab"} Apr 17 14:28:19.822058 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:19.822016 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" event={"ID":"c7a956e0-933f-4307-b894-215ad8b570bf","Type":"ContainerStarted","Data":"32d6c3b59b7cfbdf668cbdda0127ef96f9b0f6ae36655bf667ef65c0beb398aa"} Apr 17 14:28:19.822453 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:19.822141 2575 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" Apr 17 14:28:19.838362 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:19.838315 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r" podStartSLOduration=1.142599559 podStartE2EDuration="3.838303957s" podCreationTimestamp="2026-04-17 14:28:16 +0000 UTC" firstStartedPulling="2026-04-17 14:28:16.506861677 +0000 UTC m=+414.813615274" lastFinishedPulling="2026-04-17 14:28:19.202566075 +0000 UTC m=+417.509319672" observedRunningTime="2026-04-17 14:28:19.837071413 +0000 UTC m=+418.143825034" watchObservedRunningTime="2026-04-17 14:28:19.838303957 +0000 UTC m=+418.145057571" Apr 17 14:28:25.491582 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.491547 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq"] Apr 17 14:28:25.495817 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.495801 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq" Apr 17 14:28:25.498442 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.498424 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-kj5pq\"" Apr 17 14:28:25.498552 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.498426 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:28:25.499558 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.499542 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 14:28:25.503483 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.503461 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq"] Apr 17 14:28:25.653727 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.653690 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lltzg\" (UniqueName: \"kubernetes.io/projected/0abf4507-e6c4-4663-b676-41c4b08ede12-kube-api-access-lltzg\") pod \"openshift-lws-operator-bfc7f696d-fhlmq\" (UID: \"0abf4507-e6c4-4663-b676-41c4b08ede12\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq" Apr 17 14:28:25.653950 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.653787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0abf4507-e6c4-4663-b676-41c4b08ede12-tmp\") pod \"openshift-lws-operator-bfc7f696d-fhlmq\" (UID: \"0abf4507-e6c4-4663-b676-41c4b08ede12\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq" Apr 17 14:28:25.754650 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.754570 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lltzg\" (UniqueName: \"kubernetes.io/projected/0abf4507-e6c4-4663-b676-41c4b08ede12-kube-api-access-lltzg\") pod \"openshift-lws-operator-bfc7f696d-fhlmq\" (UID: \"0abf4507-e6c4-4663-b676-41c4b08ede12\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq"
Apr 17 14:28:25.754650 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.754647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0abf4507-e6c4-4663-b676-41c4b08ede12-tmp\") pod \"openshift-lws-operator-bfc7f696d-fhlmq\" (UID: \"0abf4507-e6c4-4663-b676-41c4b08ede12\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq"
Apr 17 14:28:25.755006 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.754989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0abf4507-e6c4-4663-b676-41c4b08ede12-tmp\") pod \"openshift-lws-operator-bfc7f696d-fhlmq\" (UID: \"0abf4507-e6c4-4663-b676-41c4b08ede12\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq"
Apr 17 14:28:25.763296 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.763277 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lltzg\" (UniqueName: \"kubernetes.io/projected/0abf4507-e6c4-4663-b676-41c4b08ede12-kube-api-access-lltzg\") pod \"openshift-lws-operator-bfc7f696d-fhlmq\" (UID: \"0abf4507-e6c4-4663-b676-41c4b08ede12\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq"
Apr 17 14:28:25.805205 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.805181 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq"
Apr 17 14:28:25.829348 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.829329 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-tvd6r"
Apr 17 14:28:25.928971 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:25.928941 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq"]
Apr 17 14:28:25.932335 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:28:25.932306 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0abf4507_e6c4_4663_b676_41c4b08ede12.slice/crio-34a4e372dbbf56c18497ee2379fc6b85ee49abd74ac2a5c9e79e16801e89d231 WatchSource:0}: Error finding container 34a4e372dbbf56c18497ee2379fc6b85ee49abd74ac2a5c9e79e16801e89d231: Status 404 returned error can't find the container with id 34a4e372dbbf56c18497ee2379fc6b85ee49abd74ac2a5c9e79e16801e89d231
Apr 17 14:28:26.850098 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:26.850050 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq" event={"ID":"0abf4507-e6c4-4663-b676-41c4b08ede12","Type":"ContainerStarted","Data":"34a4e372dbbf56c18497ee2379fc6b85ee49abd74ac2a5c9e79e16801e89d231"}
Apr 17 14:28:28.857788 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:28.857755 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq" event={"ID":"0abf4507-e6c4-4663-b676-41c4b08ede12","Type":"ContainerStarted","Data":"4186cf54fa0d0f59fc53af8436f153f560db52c178f84d426a4fdb3fcfe97d1f"}
Apr 17 14:28:28.881965 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:28.881902 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fhlmq" podStartSLOduration=1.406600515 podStartE2EDuration="3.881887587s" podCreationTimestamp="2026-04-17 14:28:25 +0000 UTC" firstStartedPulling="2026-04-17 14:28:25.933860434 +0000 UTC m=+424.240614044" lastFinishedPulling="2026-04-17 14:28:28.409147506 +0000 UTC m=+426.715901116" observedRunningTime="2026-04-17 14:28:28.879826587 +0000 UTC m=+427.186580211" watchObservedRunningTime="2026-04-17 14:28:28.881887587 +0000 UTC m=+427.188641198"
Apr 17 14:28:48.029495 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.029422 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"]
Apr 17 14:28:48.036977 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.036955 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:48.039505 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.039482 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 14:28:48.039909 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.039881 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 14:28:48.040046 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.039908 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-268bl\""
Apr 17 14:28:48.040046 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.039930 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 14:28:48.040133 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.040050 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 14:28:48.048421 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.048401 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"]
Apr 17 14:28:48.138270 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.138240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e4f5f7f-c245-4add-bc95-575616281d6a-webhook-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-rhcxr\" (UID: \"5e4f5f7f-c245-4add-bc95-575616281d6a\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:48.138409 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.138294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e4f5f7f-c245-4add-bc95-575616281d6a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-rhcxr\" (UID: \"5e4f5f7f-c245-4add-bc95-575616281d6a\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:48.138409 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.138342 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqnhk\" (UniqueName: \"kubernetes.io/projected/5e4f5f7f-c245-4add-bc95-575616281d6a-kube-api-access-hqnhk\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-rhcxr\" (UID: \"5e4f5f7f-c245-4add-bc95-575616281d6a\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:48.239339 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.239313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e4f5f7f-c245-4add-bc95-575616281d6a-webhook-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-rhcxr\" (UID: \"5e4f5f7f-c245-4add-bc95-575616281d6a\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:48.239471 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.239361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e4f5f7f-c245-4add-bc95-575616281d6a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-rhcxr\" (UID: \"5e4f5f7f-c245-4add-bc95-575616281d6a\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:48.239471 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.239394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqnhk\" (UniqueName: \"kubernetes.io/projected/5e4f5f7f-c245-4add-bc95-575616281d6a-kube-api-access-hqnhk\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-rhcxr\" (UID: \"5e4f5f7f-c245-4add-bc95-575616281d6a\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:48.241698 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.241675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e4f5f7f-c245-4add-bc95-575616281d6a-webhook-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-rhcxr\" (UID: \"5e4f5f7f-c245-4add-bc95-575616281d6a\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:48.241830 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.241809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e4f5f7f-c245-4add-bc95-575616281d6a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-rhcxr\" (UID: \"5e4f5f7f-c245-4add-bc95-575616281d6a\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:48.249781 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.249758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqnhk\" (UniqueName: \"kubernetes.io/projected/5e4f5f7f-c245-4add-bc95-575616281d6a-kube-api-access-hqnhk\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-rhcxr\" (UID: \"5e4f5f7f-c245-4add-bc95-575616281d6a\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:48.347743 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.347710 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:48.477314 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.477286 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"]
Apr 17 14:28:48.479420 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:28:48.479391 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e4f5f7f_c245_4add_bc95_575616281d6a.slice/crio-98d70c988bcf0cae143342c4603329fc6edd9723f68b257472337e9e5c5aee1f WatchSource:0}: Error finding container 98d70c988bcf0cae143342c4603329fc6edd9723f68b257472337e9e5c5aee1f: Status 404 returned error can't find the container with id 98d70c988bcf0cae143342c4603329fc6edd9723f68b257472337e9e5c5aee1f
Apr 17 14:28:48.930825 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:48.930786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr" event={"ID":"5e4f5f7f-c245-4add-bc95-575616281d6a","Type":"ContainerStarted","Data":"98d70c988bcf0cae143342c4603329fc6edd9723f68b257472337e9e5c5aee1f"}
Apr 17 14:28:51.942805 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:51.942771 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr" event={"ID":"5e4f5f7f-c245-4add-bc95-575616281d6a","Type":"ContainerStarted","Data":"870fda7d81113bcc3ed9301cab877a467b0b461ba2bd3f86f66af685ce9a7b04"}
Apr 17 14:28:51.943250 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:51.942910 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:28:51.964246 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:28:51.964204 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr" podStartSLOduration=1.373643143 podStartE2EDuration="3.964191593s" podCreationTimestamp="2026-04-17 14:28:48 +0000 UTC" firstStartedPulling="2026-04-17 14:28:48.481148151 +0000 UTC m=+446.787901750" lastFinishedPulling="2026-04-17 14:28:51.071696586 +0000 UTC m=+449.378450200" observedRunningTime="2026-04-17 14:28:51.9618368 +0000 UTC m=+450.268590443" watchObservedRunningTime="2026-04-17 14:28:51.964191593 +0000 UTC m=+450.270945212"
Apr 17 14:29:02.948190 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:02.948161 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-rhcxr"
Apr 17 14:29:05.165713 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.165680 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"]
Apr 17 14:29:05.172635 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.172614 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:05.175195 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.175176 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 17 14:29:05.175365 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.175342 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 14:29:05.176468 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.176448 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-m9jnm\""
Apr 17 14:29:05.178679 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.178654 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"]
Apr 17 14:29:05.296830 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.296791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b235a5d-f63d-4963-bd85-69721b597cd4-tls-certs\") pod \"kube-auth-proxy-5b474cc896-hrn2m\" (UID: \"1b235a5d-f63d-4963-bd85-69721b597cd4\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:05.296830 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.296835 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8j72\" (UniqueName: \"kubernetes.io/projected/1b235a5d-f63d-4963-bd85-69721b597cd4-kube-api-access-s8j72\") pod \"kube-auth-proxy-5b474cc896-hrn2m\" (UID: \"1b235a5d-f63d-4963-bd85-69721b597cd4\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:05.297085 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.296944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b235a5d-f63d-4963-bd85-69721b597cd4-tmp\") pod \"kube-auth-proxy-5b474cc896-hrn2m\" (UID: \"1b235a5d-f63d-4963-bd85-69721b597cd4\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:05.397631 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.397597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b235a5d-f63d-4963-bd85-69721b597cd4-tls-certs\") pod \"kube-auth-proxy-5b474cc896-hrn2m\" (UID: \"1b235a5d-f63d-4963-bd85-69721b597cd4\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:05.397803 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.397640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8j72\" (UniqueName: \"kubernetes.io/projected/1b235a5d-f63d-4963-bd85-69721b597cd4-kube-api-access-s8j72\") pod \"kube-auth-proxy-5b474cc896-hrn2m\" (UID: \"1b235a5d-f63d-4963-bd85-69721b597cd4\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:05.397803 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:29:05.397757 2575 secret.go:189] Couldn't get secret openshift-ingress/kube-auth-proxy-tls: secret "kube-auth-proxy-tls" not found
Apr 17 14:29:05.397974 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:29:05.397848 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b235a5d-f63d-4963-bd85-69721b597cd4-tls-certs podName:1b235a5d-f63d-4963-bd85-69721b597cd4 nodeName:}" failed. No retries permitted until 2026-04-17 14:29:05.897826586 +0000 UTC m=+464.204580190 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1b235a5d-f63d-4963-bd85-69721b597cd4-tls-certs") pod "kube-auth-proxy-5b474cc896-hrn2m" (UID: "1b235a5d-f63d-4963-bd85-69721b597cd4") : secret "kube-auth-proxy-tls" not found
Apr 17 14:29:05.397974 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.397753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b235a5d-f63d-4963-bd85-69721b597cd4-tmp\") pod \"kube-auth-proxy-5b474cc896-hrn2m\" (UID: \"1b235a5d-f63d-4963-bd85-69721b597cd4\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:05.400033 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.400008 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b235a5d-f63d-4963-bd85-69721b597cd4-tmp\") pod \"kube-auth-proxy-5b474cc896-hrn2m\" (UID: \"1b235a5d-f63d-4963-bd85-69721b597cd4\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:05.407239 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.407218 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8j72\" (UniqueName: \"kubernetes.io/projected/1b235a5d-f63d-4963-bd85-69721b597cd4-kube-api-access-s8j72\") pod \"kube-auth-proxy-5b474cc896-hrn2m\" (UID: \"1b235a5d-f63d-4963-bd85-69721b597cd4\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:05.901902 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.901825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b235a5d-f63d-4963-bd85-69721b597cd4-tls-certs\") pod \"kube-auth-proxy-5b474cc896-hrn2m\" (UID: \"1b235a5d-f63d-4963-bd85-69721b597cd4\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:05.904317 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:05.904293 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b235a5d-f63d-4963-bd85-69721b597cd4-tls-certs\") pod \"kube-auth-proxy-5b474cc896-hrn2m\" (UID: \"1b235a5d-f63d-4963-bd85-69721b597cd4\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:06.084396 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:06.084366 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"
Apr 17 14:29:06.221578 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:06.221553 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m"]
Apr 17 14:29:06.224061 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:29:06.224033 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b235a5d_f63d_4963_bd85_69721b597cd4.slice/crio-227ffbbb9e5551f65ebaef35ef87d6baaa56d8495b0c0aa8e9700ecaaabe7f41 WatchSource:0}: Error finding container 227ffbbb9e5551f65ebaef35ef87d6baaa56d8495b0c0aa8e9700ecaaabe7f41: Status 404 returned error can't find the container with id 227ffbbb9e5551f65ebaef35ef87d6baaa56d8495b0c0aa8e9700ecaaabe7f41
Apr 17 14:29:06.997360 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:06.997319 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m" event={"ID":"1b235a5d-f63d-4963-bd85-69721b597cd4","Type":"ContainerStarted","Data":"227ffbbb9e5551f65ebaef35ef87d6baaa56d8495b0c0aa8e9700ecaaabe7f41"}
Apr 17 14:29:08.191015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.190981 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-24dl9"]
Apr 17 14:29:08.194799 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.194776 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9"
Apr 17 14:29:08.197631 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.197606 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 17 14:29:08.197995 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.197610 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-bssm4\""
Apr 17 14:29:08.201239 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.201217 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-24dl9"]
Apr 17 14:29:08.323957 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.323866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp96c\" (UniqueName: \"kubernetes.io/projected/c4818cef-8813-4ed3-ae34-1ff8d2acb789-kube-api-access-wp96c\") pod \"odh-model-controller-858dbf95b8-24dl9\" (UID: \"c4818cef-8813-4ed3-ae34-1ff8d2acb789\") " pod="opendatahub/odh-model-controller-858dbf95b8-24dl9"
Apr 17 14:29:08.323957 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.323926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4818cef-8813-4ed3-ae34-1ff8d2acb789-cert\") pod \"odh-model-controller-858dbf95b8-24dl9\" (UID: \"c4818cef-8813-4ed3-ae34-1ff8d2acb789\") " pod="opendatahub/odh-model-controller-858dbf95b8-24dl9"
Apr 17 14:29:08.425117 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.425081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp96c\" (UniqueName: \"kubernetes.io/projected/c4818cef-8813-4ed3-ae34-1ff8d2acb789-kube-api-access-wp96c\") pod \"odh-model-controller-858dbf95b8-24dl9\" (UID: \"c4818cef-8813-4ed3-ae34-1ff8d2acb789\") " pod="opendatahub/odh-model-controller-858dbf95b8-24dl9"
Apr 17 14:29:08.425117 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.425115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4818cef-8813-4ed3-ae34-1ff8d2acb789-cert\") pod \"odh-model-controller-858dbf95b8-24dl9\" (UID: \"c4818cef-8813-4ed3-ae34-1ff8d2acb789\") " pod="opendatahub/odh-model-controller-858dbf95b8-24dl9"
Apr 17 14:29:08.425367 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:29:08.425231 2575 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 17 14:29:08.425367 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:29:08.425291 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4818cef-8813-4ed3-ae34-1ff8d2acb789-cert podName:c4818cef-8813-4ed3-ae34-1ff8d2acb789 nodeName:}" failed. No retries permitted until 2026-04-17 14:29:08.925276166 +0000 UTC m=+467.232029764 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c4818cef-8813-4ed3-ae34-1ff8d2acb789-cert") pod "odh-model-controller-858dbf95b8-24dl9" (UID: "c4818cef-8813-4ed3-ae34-1ff8d2acb789") : secret "odh-model-controller-webhook-cert" not found
Apr 17 14:29:08.433687 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.433659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp96c\" (UniqueName: \"kubernetes.io/projected/c4818cef-8813-4ed3-ae34-1ff8d2acb789-kube-api-access-wp96c\") pod \"odh-model-controller-858dbf95b8-24dl9\" (UID: \"c4818cef-8813-4ed3-ae34-1ff8d2acb789\") " pod="opendatahub/odh-model-controller-858dbf95b8-24dl9"
Apr 17 14:29:08.928652 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.928612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4818cef-8813-4ed3-ae34-1ff8d2acb789-cert\") pod \"odh-model-controller-858dbf95b8-24dl9\" (UID: \"c4818cef-8813-4ed3-ae34-1ff8d2acb789\") " pod="opendatahub/odh-model-controller-858dbf95b8-24dl9"
Apr 17 14:29:08.931075 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:08.931049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4818cef-8813-4ed3-ae34-1ff8d2acb789-cert\") pod \"odh-model-controller-858dbf95b8-24dl9\" (UID: \"c4818cef-8813-4ed3-ae34-1ff8d2acb789\") " pod="opendatahub/odh-model-controller-858dbf95b8-24dl9"
Apr 17 14:29:09.108910 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:09.108859 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9"
Apr 17 14:29:09.768114 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:09.768093 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-24dl9"]
Apr 17 14:29:09.769376 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:29:09.769346 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4818cef_8813_4ed3_ae34_1ff8d2acb789.slice/crio-1344cb4b5944749a58b316ce38466a025d741d0d238226cd88357f87e163d2c8 WatchSource:0}: Error finding container 1344cb4b5944749a58b316ce38466a025d741d0d238226cd88357f87e163d2c8: Status 404 returned error can't find the container with id 1344cb4b5944749a58b316ce38466a025d741d0d238226cd88357f87e163d2c8
Apr 17 14:29:10.009404 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:10.009309 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m" event={"ID":"1b235a5d-f63d-4963-bd85-69721b597cd4","Type":"ContainerStarted","Data":"1f773715581548021fdded7d58cc36a2a6fc8ae5a5fa7e8c28d6eb4b713156d3"}
Apr 17 14:29:10.010275 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:10.010249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" event={"ID":"c4818cef-8813-4ed3-ae34-1ff8d2acb789","Type":"ContainerStarted","Data":"1344cb4b5944749a58b316ce38466a025d741d0d238226cd88357f87e163d2c8"}
Apr 17 14:29:10.024481 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:10.024304 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5b474cc896-hrn2m" podStartSLOduration=1.5394387219999999 podStartE2EDuration="5.02428824s" podCreationTimestamp="2026-04-17 14:29:05 +0000 UTC" firstStartedPulling="2026-04-17 14:29:06.22572074 +0000 UTC m=+464.532474337" lastFinishedPulling="2026-04-17 14:29:09.710570256 +0000 UTC m=+468.017323855" observedRunningTime="2026-04-17 14:29:10.024140364 +0000 UTC m=+468.330893984" watchObservedRunningTime="2026-04-17 14:29:10.02428824 +0000 UTC m=+468.331041860"
Apr 17 14:29:13.022188 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.022156 2575 generic.go:358] "Generic (PLEG): container finished" podID="c4818cef-8813-4ed3-ae34-1ff8d2acb789" containerID="73949c38f9602a8d54d0b01c8c263ebaa09299bb925a413b0fc12d8b1b26f2a7" exitCode=1
Apr 17 14:29:13.022552 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.022202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" event={"ID":"c4818cef-8813-4ed3-ae34-1ff8d2acb789","Type":"ContainerDied","Data":"73949c38f9602a8d54d0b01c8c263ebaa09299bb925a413b0fc12d8b1b26f2a7"}
Apr 17 14:29:13.022552 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.022434 2575 scope.go:117] "RemoveContainer" containerID="73949c38f9602a8d54d0b01c8c263ebaa09299bb925a413b0fc12d8b1b26f2a7"
Apr 17 14:29:13.731774 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.731738 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-sp7bk"]
Apr 17 14:29:13.735129 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.735113 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk"
Apr 17 14:29:13.737695 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.737671 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 17 14:29:13.737802 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.737731 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-wwcnc\""
Apr 17 14:29:13.743238 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.743215 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-sp7bk"]
Apr 17 14:29:13.873079 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.873047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n66k5\" (UniqueName: \"kubernetes.io/projected/cda3b0f0-f542-412c-aab3-748fec9a0d43-kube-api-access-n66k5\") pod \"kserve-controller-manager-856948b99f-sp7bk\" (UID: \"cda3b0f0-f542-412c-aab3-748fec9a0d43\") " pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk"
Apr 17 14:29:13.873243 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.873113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda3b0f0-f542-412c-aab3-748fec9a0d43-cert\") pod \"kserve-controller-manager-856948b99f-sp7bk\" (UID: \"cda3b0f0-f542-412c-aab3-748fec9a0d43\") " pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk"
Apr 17 14:29:13.974446 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.974416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n66k5\" (UniqueName: \"kubernetes.io/projected/cda3b0f0-f542-412c-aab3-748fec9a0d43-kube-api-access-n66k5\") pod \"kserve-controller-manager-856948b99f-sp7bk\" (UID: \"cda3b0f0-f542-412c-aab3-748fec9a0d43\") " pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk"
Apr 17 14:29:13.974627 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.974480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda3b0f0-f542-412c-aab3-748fec9a0d43-cert\") pod \"kserve-controller-manager-856948b99f-sp7bk\" (UID: \"cda3b0f0-f542-412c-aab3-748fec9a0d43\") " pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk"
Apr 17 14:29:13.974627 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:29:13.974567 2575 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 17 14:29:13.974627 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:29:13.974621 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda3b0f0-f542-412c-aab3-748fec9a0d43-cert podName:cda3b0f0-f542-412c-aab3-748fec9a0d43 nodeName:}" failed. No retries permitted until 2026-04-17 14:29:14.474605154 +0000 UTC m=+472.781358752 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cda3b0f0-f542-412c-aab3-748fec9a0d43-cert") pod "kserve-controller-manager-856948b99f-sp7bk" (UID: "cda3b0f0-f542-412c-aab3-748fec9a0d43") : secret "kserve-webhook-server-cert" not found
Apr 17 14:29:13.991625 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:13.991553 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n66k5\" (UniqueName: \"kubernetes.io/projected/cda3b0f0-f542-412c-aab3-748fec9a0d43-kube-api-access-n66k5\") pod \"kserve-controller-manager-856948b99f-sp7bk\" (UID: \"cda3b0f0-f542-412c-aab3-748fec9a0d43\") " pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk"
Apr 17 14:29:14.027587 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:14.027556 2575 generic.go:358] "Generic (PLEG): container finished" podID="c4818cef-8813-4ed3-ae34-1ff8d2acb789" containerID="67ff7b8f4f1e0d5dc9338d92bd6b0d1bc7d3c01eecc0a84120e65d9b068b88a9" exitCode=1
Apr 17 14:29:14.027993 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:14.027645 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" event={"ID":"c4818cef-8813-4ed3-ae34-1ff8d2acb789","Type":"ContainerDied","Data":"67ff7b8f4f1e0d5dc9338d92bd6b0d1bc7d3c01eecc0a84120e65d9b068b88a9"}
Apr 17 14:29:14.027993 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:14.027688 2575 scope.go:117] "RemoveContainer" containerID="73949c38f9602a8d54d0b01c8c263ebaa09299bb925a413b0fc12d8b1b26f2a7"
Apr 17 14:29:14.027993 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:14.027911 2575 scope.go:117] "RemoveContainer" containerID="67ff7b8f4f1e0d5dc9338d92bd6b0d1bc7d3c01eecc0a84120e65d9b068b88a9"
Apr 17 14:29:14.028159 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:29:14.028139 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-24dl9_opendatahub(c4818cef-8813-4ed3-ae34-1ff8d2acb789)\"" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" podUID="c4818cef-8813-4ed3-ae34-1ff8d2acb789"
Apr 17 14:29:14.480082 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:14.480049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda3b0f0-f542-412c-aab3-748fec9a0d43-cert\") pod \"kserve-controller-manager-856948b99f-sp7bk\" (UID: \"cda3b0f0-f542-412c-aab3-748fec9a0d43\") " pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk"
Apr 17 14:29:14.482406 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:14.482385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda3b0f0-f542-412c-aab3-748fec9a0d43-cert\") pod \"kserve-controller-manager-856948b99f-sp7bk\" (UID: \"cda3b0f0-f542-412c-aab3-748fec9a0d43\") " pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk"
Apr 17 14:29:14.646619 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:14.646575 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk"
Apr 17 14:29:14.787455 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:14.787354 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-sp7bk"]
Apr 17 14:29:14.790313 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:29:14.790272 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcda3b0f0_f542_412c_aab3_748fec9a0d43.slice/crio-a1fc7c15f3e48b4d71c88dacc0b87a865599f09cb77f107edfb10b6a7d82ee33 WatchSource:0}: Error finding container a1fc7c15f3e48b4d71c88dacc0b87a865599f09cb77f107edfb10b6a7d82ee33: Status 404 returned error can't find the container with id a1fc7c15f3e48b4d71c88dacc0b87a865599f09cb77f107edfb10b6a7d82ee33
Apr 17 14:29:15.033333 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:15.033241 2575 scope.go:117] "RemoveContainer" containerID="67ff7b8f4f1e0d5dc9338d92bd6b0d1bc7d3c01eecc0a84120e65d9b068b88a9"
Apr 17 14:29:15.033733 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:29:15.033498 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-24dl9_opendatahub(c4818cef-8813-4ed3-ae34-1ff8d2acb789)\"" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" podUID="c4818cef-8813-4ed3-ae34-1ff8d2acb789"
Apr 17 14:29:15.034122 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:15.034102 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk" event={"ID":"cda3b0f0-f542-412c-aab3-748fec9a0d43","Type":"ContainerStarted","Data":"a1fc7c15f3e48b4d71c88dacc0b87a865599f09cb77f107edfb10b6a7d82ee33"}
Apr 17 14:29:17.849647 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:17.849606 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-44ggh"]
Apr 17 14:29:17.853011 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:17.852994 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh"
Apr 17 14:29:17.856377 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:17.856354 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 17 14:29:17.856492 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:17.856356 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-jmd77\""
Apr 17 14:29:17.856492 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:17.856474 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 17 14:29:17.866444 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:17.866422 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-44ggh"]
Apr 17 14:29:18.013631 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:18.013547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5xsw\" (UniqueName: \"kubernetes.io/projected/767d4c36-c7d8-4c0d-950e-6ccbb55d7003-kube-api-access-q5xsw\") pod \"servicemesh-operator3-55f49c5f94-44ggh\" (UID: \"767d4c36-c7d8-4c0d-950e-6ccbb55d7003\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh"
Apr 17 14:29:18.013767 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:18.013649 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/767d4c36-c7d8-4c0d-950e-6ccbb55d7003-operator-config\") pod \"servicemesh-operator3-55f49c5f94-44ggh\" (UID: \"767d4c36-c7d8-4c0d-950e-6ccbb55d7003\")
" pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh" Apr 17 14:29:18.049986 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:18.049933 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk" event={"ID":"cda3b0f0-f542-412c-aab3-748fec9a0d43","Type":"ContainerStarted","Data":"c464f10ec14eed349acfb58fb3a64408420b127d72e304dc7620dc565f2c0c92"} Apr 17 14:29:18.050178 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:18.050031 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk" Apr 17 14:29:18.068313 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:18.068267 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk" podStartSLOduration=2.228757377 podStartE2EDuration="5.06825349s" podCreationTimestamp="2026-04-17 14:29:13 +0000 UTC" firstStartedPulling="2026-04-17 14:29:14.791764274 +0000 UTC m=+473.098517871" lastFinishedPulling="2026-04-17 14:29:17.631260387 +0000 UTC m=+475.938013984" observedRunningTime="2026-04-17 14:29:18.067475302 +0000 UTC m=+476.374228921" watchObservedRunningTime="2026-04-17 14:29:18.06825349 +0000 UTC m=+476.375007108" Apr 17 14:29:18.114476 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:18.114446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5xsw\" (UniqueName: \"kubernetes.io/projected/767d4c36-c7d8-4c0d-950e-6ccbb55d7003-kube-api-access-q5xsw\") pod \"servicemesh-operator3-55f49c5f94-44ggh\" (UID: \"767d4c36-c7d8-4c0d-950e-6ccbb55d7003\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh" Apr 17 14:29:18.114627 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:18.114512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: 
\"kubernetes.io/downward-api/767d4c36-c7d8-4c0d-950e-6ccbb55d7003-operator-config\") pod \"servicemesh-operator3-55f49c5f94-44ggh\" (UID: \"767d4c36-c7d8-4c0d-950e-6ccbb55d7003\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh" Apr 17 14:29:18.117060 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:18.117036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/767d4c36-c7d8-4c0d-950e-6ccbb55d7003-operator-config\") pod \"servicemesh-operator3-55f49c5f94-44ggh\" (UID: \"767d4c36-c7d8-4c0d-950e-6ccbb55d7003\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh" Apr 17 14:29:18.125025 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:18.125005 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5xsw\" (UniqueName: \"kubernetes.io/projected/767d4c36-c7d8-4c0d-950e-6ccbb55d7003-kube-api-access-q5xsw\") pod \"servicemesh-operator3-55f49c5f94-44ggh\" (UID: \"767d4c36-c7d8-4c0d-950e-6ccbb55d7003\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh" Apr 17 14:29:18.162587 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:18.162564 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh" Apr 17 14:29:18.297152 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:18.297085 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-44ggh"] Apr 17 14:29:18.299526 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:29:18.299498 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod767d4c36_c7d8_4c0d_950e_6ccbb55d7003.slice/crio-1617b145a98cf52672382d78352c930bf6311af7f58d3e8078b9f1cea8e1283d WatchSource:0}: Error finding container 1617b145a98cf52672382d78352c930bf6311af7f58d3e8078b9f1cea8e1283d: Status 404 returned error can't find the container with id 1617b145a98cf52672382d78352c930bf6311af7f58d3e8078b9f1cea8e1283d Apr 17 14:29:19.054676 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:19.054641 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh" event={"ID":"767d4c36-c7d8-4c0d-950e-6ccbb55d7003","Type":"ContainerStarted","Data":"1617b145a98cf52672382d78352c930bf6311af7f58d3e8078b9f1cea8e1283d"} Apr 17 14:29:19.109129 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:19.109091 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" Apr 17 14:29:19.109446 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:19.109434 2575 scope.go:117] "RemoveContainer" containerID="67ff7b8f4f1e0d5dc9338d92bd6b0d1bc7d3c01eecc0a84120e65d9b068b88a9" Apr 17 14:29:19.109616 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:29:19.109601 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-24dl9_opendatahub(c4818cef-8813-4ed3-ae34-1ff8d2acb789)\"" 
pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" podUID="c4818cef-8813-4ed3-ae34-1ff8d2acb789" Apr 17 14:29:22.067778 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:22.067747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh" event={"ID":"767d4c36-c7d8-4c0d-950e-6ccbb55d7003","Type":"ContainerStarted","Data":"8df356022db7ed024edfba9f6ec92ef2c768197a2ac1d40955bc74a331772bf5"} Apr 17 14:29:22.068160 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:22.067846 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh" Apr 17 14:29:22.090733 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:22.090693 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh" podStartSLOduration=1.956249811 podStartE2EDuration="5.090681663s" podCreationTimestamp="2026-04-17 14:29:17 +0000 UTC" firstStartedPulling="2026-04-17 14:29:18.301969116 +0000 UTC m=+476.608722714" lastFinishedPulling="2026-04-17 14:29:21.436400955 +0000 UTC m=+479.743154566" observedRunningTime="2026-04-17 14:29:22.089054019 +0000 UTC m=+480.395807664" watchObservedRunningTime="2026-04-17 14:29:22.090681663 +0000 UTC m=+480.397435281" Apr 17 14:29:27.895839 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:27.895805 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k"] Apr 17 14:29:27.901726 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:27.901705 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:27.904398 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:27.904375 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 14:29:27.904519 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:27.904375 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 14:29:27.904519 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:27.904375 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 14:29:27.904519 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:27.904427 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-g2wsf\"" Apr 17 14:29:27.904733 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:27.904703 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 14:29:27.908857 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:27.908831 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k"] Apr 17 14:29:28.000028 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:27.999997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/0adf8995-4617-448e-b417-f69e0ac85b30-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.000184 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.000057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/0adf8995-4617-448e-b417-f69e0ac85b30-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.000184 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.000106 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/0adf8995-4617-448e-b417-f69e0ac85b30-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.000184 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.000147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0adf8995-4617-448e-b417-f69e0ac85b30-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.000290 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.000183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv4mr\" (UniqueName: \"kubernetes.io/projected/0adf8995-4617-448e-b417-f69e0ac85b30-kube-api-access-sv4mr\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.000290 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.000229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0adf8995-4617-448e-b417-f69e0ac85b30-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: 
\"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.000290 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.000261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/0adf8995-4617-448e-b417-f69e0ac85b30-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.100982 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.100941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/0adf8995-4617-448e-b417-f69e0ac85b30-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.101169 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.101000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/0adf8995-4617-448e-b417-f69e0ac85b30-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.101169 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.101118 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0adf8995-4617-448e-b417-f69e0ac85b30-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.101169 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.101158 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv4mr\" (UniqueName: \"kubernetes.io/projected/0adf8995-4617-448e-b417-f69e0ac85b30-kube-api-access-sv4mr\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.101340 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.101202 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0adf8995-4617-448e-b417-f69e0ac85b30-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.101340 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.101229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/0adf8995-4617-448e-b417-f69e0ac85b30-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.101340 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.101284 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/0adf8995-4617-448e-b417-f69e0ac85b30-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.101632 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.101603 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/0adf8995-4617-448e-b417-f69e0ac85b30-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.103795 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.103770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/0adf8995-4617-448e-b417-f69e0ac85b30-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.103987 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.103965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/0adf8995-4617-448e-b417-f69e0ac85b30-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.104153 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.104131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/0adf8995-4617-448e-b417-f69e0ac85b30-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.104203 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.104189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0adf8995-4617-448e-b417-f69e0ac85b30-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.109591 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.109564 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sv4mr\" (UniqueName: \"kubernetes.io/projected/0adf8995-4617-448e-b417-f69e0ac85b30-kube-api-access-sv4mr\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.109664 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.109614 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0adf8995-4617-448e-b417-f69e0ac85b30-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4hj5k\" (UID: \"0adf8995-4617-448e-b417-f69e0ac85b30\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.212923 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.212850 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:28.345655 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:28.345625 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k"] Apr 17 14:29:28.348220 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:29:28.348158 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0adf8995_4617_448e_b417_f69e0ac85b30.slice/crio-988dba12ee514dfbc034f99e391af0cb1bcfcafb681e713ff76b0f0992e68efa WatchSource:0}: Error finding container 988dba12ee514dfbc034f99e391af0cb1bcfcafb681e713ff76b0f0992e68efa: Status 404 returned error can't find the container with id 988dba12ee514dfbc034f99e391af0cb1bcfcafb681e713ff76b0f0992e68efa Apr 17 14:29:29.092831 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:29.092797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" 
event={"ID":"0adf8995-4617-448e-b417-f69e0ac85b30","Type":"ContainerStarted","Data":"988dba12ee514dfbc034f99e391af0cb1bcfcafb681e713ff76b0f0992e68efa"} Apr 17 14:29:29.109292 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:29.109267 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" Apr 17 14:29:29.109722 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:29.109706 2575 scope.go:117] "RemoveContainer" containerID="67ff7b8f4f1e0d5dc9338d92bd6b0d1bc7d3c01eecc0a84120e65d9b068b88a9" Apr 17 14:29:30.099165 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:30.099125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" event={"ID":"c4818cef-8813-4ed3-ae34-1ff8d2acb789","Type":"ContainerStarted","Data":"59e23e213c3d5f7778ae1515dfdd24cd9bbc532323dcfdc47bb0ba85584c3da8"} Apr 17 14:29:30.099625 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:30.099353 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" Apr 17 14:29:30.116775 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:30.116701 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" podStartSLOduration=2.476584152 podStartE2EDuration="22.116685617s" podCreationTimestamp="2026-04-17 14:29:08 +0000 UTC" firstStartedPulling="2026-04-17 14:29:09.770814622 +0000 UTC m=+468.077568224" lastFinishedPulling="2026-04-17 14:29:29.410916092 +0000 UTC m=+487.717669689" observedRunningTime="2026-04-17 14:29:30.115713807 +0000 UTC m=+488.422467429" watchObservedRunningTime="2026-04-17 14:29:30.116685617 +0000 UTC m=+488.423439235" Apr 17 14:29:31.706958 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:31.706912 2575 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 14:29:31.707231 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:31.706995 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 14:29:32.109426 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:32.109390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" event={"ID":"0adf8995-4617-448e-b417-f69e0ac85b30","Type":"ContainerStarted","Data":"f5bbfc173ccf2004403c4ee882d15deedffe6dab2a1c3b97dd4d472249c97431"} Apr 17 14:29:32.109604 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:32.109591 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:32.110937 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:32.110909 2575 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-4hj5k container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 17 14:29:32.111066 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:32.110971 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" podUID="0adf8995-4617-448e-b417-f69e0ac85b30" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 14:29:32.130194 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:32.130151 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" podStartSLOduration=1.773732026 podStartE2EDuration="5.130136524s" podCreationTimestamp="2026-04-17 14:29:27 +0000 UTC" firstStartedPulling="2026-04-17 
14:29:28.350213994 +0000 UTC m=+486.656967596" lastFinishedPulling="2026-04-17 14:29:31.706618496 +0000 UTC m=+490.013372094" observedRunningTime="2026-04-17 14:29:32.127182565 +0000 UTC m=+490.433936196" watchObservedRunningTime="2026-04-17 14:29:32.130136524 +0000 UTC m=+490.436890143" Apr 17 14:29:33.073025 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:33.072992 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-44ggh" Apr 17 14:29:33.113395 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:33.113365 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4hj5k" Apr 17 14:29:41.108284 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:41.108252 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-24dl9" Apr 17 14:29:49.060288 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:29:49.060262 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-sp7bk" Apr 17 14:30:39.451278 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.451239 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq"] Apr 17 14:30:39.454999 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.454977 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" Apr 17 14:30:39.457782 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.457756 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-hgfvd\"" Apr 17 14:30:39.457916 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.457862 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 14:30:39.458064 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.458045 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 14:30:39.465155 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.465133 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq"] Apr 17 14:30:39.536344 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.536314 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rchsq\" (UID: \"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" Apr 17 14:30:39.536615 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.536584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4z8x\" (UniqueName: \"kubernetes.io/projected/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-kube-api-access-d4z8x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rchsq\" (UID: \"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" Apr 17 14:30:39.637256 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:30:39.637218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4z8x\" (UniqueName: \"kubernetes.io/projected/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-kube-api-access-d4z8x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rchsq\" (UID: \"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" Apr 17 14:30:39.637422 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.637283 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rchsq\" (UID: \"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" Apr 17 14:30:39.637631 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.637613 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rchsq\" (UID: \"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" Apr 17 14:30:39.652115 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.652088 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4z8x\" (UniqueName: \"kubernetes.io/projected/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-kube-api-access-d4z8x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rchsq\" (UID: \"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" Apr 17 14:30:39.766278 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.766193 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq"
Apr 17 14:30:39.894533 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:39.894510 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq"]
Apr 17 14:30:39.897211 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:30:39.897182 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d0fc5a_9e2b_4bde_bc8e_17dd4a7dba87.slice/crio-d006c62ac169ee440af9f589b02b8f2561b2ffc37915d51d14a470337813e2fb WatchSource:0}: Error finding container d006c62ac169ee440af9f589b02b8f2561b2ffc37915d51d14a470337813e2fb: Status 404 returned error can't find the container with id d006c62ac169ee440af9f589b02b8f2561b2ffc37915d51d14a470337813e2fb
Apr 17 14:30:40.345087 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:40.345052 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" event={"ID":"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87","Type":"ContainerStarted","Data":"d006c62ac169ee440af9f589b02b8f2561b2ffc37915d51d14a470337813e2fb"}
Apr 17 14:30:44.571689 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:44.571655 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf"]
Apr 17 14:30:44.575467 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:44.575440 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf"
Apr 17 14:30:44.578323 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:44.578300 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-kfllh\""
Apr 17 14:30:44.578452 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:44.578299 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 17 14:30:44.585321 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:44.585278 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf"]
Apr 17 14:30:44.687543 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:44.687500 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm98s\" (UniqueName: \"kubernetes.io/projected/cd186241-8519-4040-941f-5bfbe4ca67a8-kube-api-access-cm98s\") pod \"dns-operator-controller-manager-648d5c98bc-ctpdf\" (UID: \"cd186241-8519-4040-941f-5bfbe4ca67a8\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf"
Apr 17 14:30:44.789045 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:44.788994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cm98s\" (UniqueName: \"kubernetes.io/projected/cd186241-8519-4040-941f-5bfbe4ca67a8-kube-api-access-cm98s\") pod \"dns-operator-controller-manager-648d5c98bc-ctpdf\" (UID: \"cd186241-8519-4040-941f-5bfbe4ca67a8\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf"
Apr 17 14:30:44.799005 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:44.798971 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm98s\" (UniqueName: \"kubernetes.io/projected/cd186241-8519-4040-941f-5bfbe4ca67a8-kube-api-access-cm98s\") pod \"dns-operator-controller-manager-648d5c98bc-ctpdf\" (UID: \"cd186241-8519-4040-941f-5bfbe4ca67a8\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf"
Apr 17 14:30:44.889974 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:44.889852 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf"
Apr 17 14:30:45.803910 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:45.803863 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf"]
Apr 17 14:30:45.807242 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:30:45.807213 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd186241_8519_4040_941f_5bfbe4ca67a8.slice/crio-87122bc404401c8b2db2fc985885425e92715e594b4ab480400a6bf340d53eca WatchSource:0}: Error finding container 87122bc404401c8b2db2fc985885425e92715e594b4ab480400a6bf340d53eca: Status 404 returned error can't find the container with id 87122bc404401c8b2db2fc985885425e92715e594b4ab480400a6bf340d53eca
Apr 17 14:30:46.368537 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:46.368500 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf" event={"ID":"cd186241-8519-4040-941f-5bfbe4ca67a8","Type":"ContainerStarted","Data":"87122bc404401c8b2db2fc985885425e92715e594b4ab480400a6bf340d53eca"}
Apr 17 14:30:46.370029 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:46.370004 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" event={"ID":"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87","Type":"ContainerStarted","Data":"d89e1814328f544e4e3b677b224fac34b46b4a5c51c8a8bd398f3b1dbc680784"}
Apr 17 14:30:46.370214 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:46.370095 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq"
Apr 17 14:30:46.403550 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:46.403487 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" podStartSLOduration=1.5702486169999998 podStartE2EDuration="7.403466825s" podCreationTimestamp="2026-04-17 14:30:39 +0000 UTC" firstStartedPulling="2026-04-17 14:30:39.899636606 +0000 UTC m=+558.206390203" lastFinishedPulling="2026-04-17 14:30:45.732854808 +0000 UTC m=+564.039608411" observedRunningTime="2026-04-17 14:30:46.398773641 +0000 UTC m=+564.705527297" watchObservedRunningTime="2026-04-17 14:30:46.403466825 +0000 UTC m=+564.710220446"
Apr 17 14:30:48.863681 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:48.863644 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f577ffdcb-v5fb5"]
Apr 17 14:30:48.867513 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:48.867488 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:48.877061 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:48.877035 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f577ffdcb-v5fb5"]
Apr 17 14:30:49.024691 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.024656 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f3d3e54-7019-4a48-81d1-e73537173f93-console-oauth-config\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.024862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.024708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-service-ca\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.024862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.024757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-trusted-ca-bundle\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.024862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.024824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3d3e54-7019-4a48-81d1-e73537173f93-console-serving-cert\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.024862 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.024850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-console-config\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.025046 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.024927 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-oauth-serving-cert\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.025046 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.024958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7cqx\" (UniqueName: \"kubernetes.io/projected/0f3d3e54-7019-4a48-81d1-e73537173f93-kube-api-access-d7cqx\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.126160 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.126073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f3d3e54-7019-4a48-81d1-e73537173f93-console-oauth-config\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.126160 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.126122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-service-ca\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.126357 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.126241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-trusted-ca-bundle\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.126357 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.126342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3d3e54-7019-4a48-81d1-e73537173f93-console-serving-cert\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.126461 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.126372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-console-config\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.126461 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.126404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-oauth-serving-cert\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.126461 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.126438 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7cqx\" (UniqueName: \"kubernetes.io/projected/0f3d3e54-7019-4a48-81d1-e73537173f93-kube-api-access-d7cqx\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.127102 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.127035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-service-ca\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.127248 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.127223 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-trusted-ca-bundle\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.127314 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.127233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-console-config\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.127314 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.127262 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f3d3e54-7019-4a48-81d1-e73537173f93-oauth-serving-cert\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.128855 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.128834 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f3d3e54-7019-4a48-81d1-e73537173f93-console-oauth-config\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.129088 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.129068 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3d3e54-7019-4a48-81d1-e73537173f93-console-serving-cert\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.134747 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.134724 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7cqx\" (UniqueName: \"kubernetes.io/projected/0f3d3e54-7019-4a48-81d1-e73537173f93-kube-api-access-d7cqx\") pod \"console-7f577ffdcb-v5fb5\" (UID: \"0f3d3e54-7019-4a48-81d1-e73537173f93\") " pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.178966 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.178938 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:49.301974 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.301946 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f577ffdcb-v5fb5"]
Apr 17 14:30:49.304139 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:30:49.304108 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f3d3e54_7019_4a48_81d1_e73537173f93.slice/crio-90e9ff6b97e6324147e1859ddbf1900d7ae0e198541aff61257e42ccee0d9f92 WatchSource:0}: Error finding container 90e9ff6b97e6324147e1859ddbf1900d7ae0e198541aff61257e42ccee0d9f92: Status 404 returned error can't find the container with id 90e9ff6b97e6324147e1859ddbf1900d7ae0e198541aff61257e42ccee0d9f92
Apr 17 14:30:49.384093 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.383986 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f577ffdcb-v5fb5" event={"ID":"0f3d3e54-7019-4a48-81d1-e73537173f93","Type":"ContainerStarted","Data":"0d68f50c29a58d4338b9f3ef69ec984bcc52904b23dc14bf9143430eef8f4ca2"}
Apr 17 14:30:49.384093 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.384029 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f577ffdcb-v5fb5" event={"ID":"0f3d3e54-7019-4a48-81d1-e73537173f93","Type":"ContainerStarted","Data":"90e9ff6b97e6324147e1859ddbf1900d7ae0e198541aff61257e42ccee0d9f92"}
Apr 17 14:30:49.385726 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.385697 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf" event={"ID":"cd186241-8519-4040-941f-5bfbe4ca67a8","Type":"ContainerStarted","Data":"2e13d3d72758ca1c2668e3a22204b635d765ac05c0a147c6cc85157a1838e49d"}
Apr 17 14:30:49.385856 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.385809 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf"
Apr 17 14:30:49.404660 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.404598 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f577ffdcb-v5fb5" podStartSLOduration=1.404579396 podStartE2EDuration="1.404579396s" podCreationTimestamp="2026-04-17 14:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:30:49.40225113 +0000 UTC m=+567.709004751" watchObservedRunningTime="2026-04-17 14:30:49.404579396 +0000 UTC m=+567.711333016"
Apr 17 14:30:49.442585 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:49.442535 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf" podStartSLOduration=2.819065991 podStartE2EDuration="5.442519156s" podCreationTimestamp="2026-04-17 14:30:44 +0000 UTC" firstStartedPulling="2026-04-17 14:30:45.80916478 +0000 UTC m=+564.115918384" lastFinishedPulling="2026-04-17 14:30:48.432617948 +0000 UTC m=+566.739371549" observedRunningTime="2026-04-17 14:30:49.439991537 +0000 UTC m=+567.746745156" watchObservedRunningTime="2026-04-17 14:30:49.442519156 +0000 UTC m=+567.749272775"
Apr 17 14:30:57.376125 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:57.376089 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq"
Apr 17 14:30:58.241477 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.241433 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq"]
Apr 17 14:30:58.241788 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.241741 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" podUID="78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87" containerName="manager" containerID="cri-o://d89e1814328f544e4e3b677b224fac34b46b4a5c51c8a8bd398f3b1dbc680784" gracePeriod=2
Apr 17 14:30:58.243978 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.243952 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq"]
Apr 17 14:30:58.244143 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.244111 2575 status_manager.go:895] "Failed to get status for pod" podUID="78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rchsq\" is forbidden: User \"system:node:ip-10-0-138-3.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-3.ec2.internal' and this object"
Apr 17 14:30:58.266558 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.266533 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"]
Apr 17 14:30:58.266996 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.266976 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87" containerName="manager"
Apr 17 14:30:58.266996 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.266994 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87" containerName="manager"
Apr 17 14:30:58.267175 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.267078 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87" containerName="manager"
Apr 17 14:30:58.270720 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.270694 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"
Apr 17 14:30:58.286718 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.286685 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn"]
Apr 17 14:30:58.290485 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.290459 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"]
Apr 17 14:30:58.290618 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.290591 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn"
Apr 17 14:30:58.297139 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.297113 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-9kjcm\""
Apr 17 14:30:58.303200 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.303167 2575 status_manager.go:895] "Failed to get status for pod" podUID="78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rchsq\" is forbidden: User \"system:node:ip-10-0-138-3.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-3.ec2.internal' and this object"
Apr 17 14:30:58.303614 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.303597 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn"]
Apr 17 14:30:58.305360 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.305330 2575 status_manager.go:895] "Failed to get status for pod" podUID="78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rchsq\" is forbidden: User \"system:node:ip-10-0-138-3.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-3.ec2.internal' and this object"
Apr 17 14:30:58.412482 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.412453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxdt7\" (UniqueName: \"kubernetes.io/projected/6b6833b0-f9b4-4249-8452-5234926b013c-kube-api-access-wxdt7\") pod \"limitador-operator-controller-manager-85c4996f8c-2x6kn\" (UID: \"6b6833b0-f9b4-4249-8452-5234926b013c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn"
Apr 17 14:30:58.413244 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.412501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gvgj\" (UniqueName: \"kubernetes.io/projected/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-kube-api-access-2gvgj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-njj79\" (UID: \"5aee7dd1-89ff-4f57-8de1-0e33fceddce6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"
Apr 17 14:30:58.413244 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.412555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-njj79\" (UID: \"5aee7dd1-89ff-4f57-8de1-0e33fceddce6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"
Apr 17 14:30:58.420905 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.420863 2575 generic.go:358] "Generic (PLEG): container finished" podID="78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87" containerID="d89e1814328f544e4e3b677b224fac34b46b4a5c51c8a8bd398f3b1dbc680784" exitCode=0
Apr 17 14:30:58.478160 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.478137 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq"
Apr 17 14:30:58.513689 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.513599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxdt7\" (UniqueName: \"kubernetes.io/projected/6b6833b0-f9b4-4249-8452-5234926b013c-kube-api-access-wxdt7\") pod \"limitador-operator-controller-manager-85c4996f8c-2x6kn\" (UID: \"6b6833b0-f9b4-4249-8452-5234926b013c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn"
Apr 17 14:30:58.513689 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.513652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gvgj\" (UniqueName: \"kubernetes.io/projected/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-kube-api-access-2gvgj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-njj79\" (UID: \"5aee7dd1-89ff-4f57-8de1-0e33fceddce6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"
Apr 17 14:30:58.513923 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.513707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-njj79\" (UID: \"5aee7dd1-89ff-4f57-8de1-0e33fceddce6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"
Apr 17 14:30:58.514118 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.514098 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-njj79\" (UID: \"5aee7dd1-89ff-4f57-8de1-0e33fceddce6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"
Apr 17 14:30:58.526480 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.526448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gvgj\" (UniqueName: \"kubernetes.io/projected/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-kube-api-access-2gvgj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-njj79\" (UID: \"5aee7dd1-89ff-4f57-8de1-0e33fceddce6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"
Apr 17 14:30:58.526564 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.526532 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxdt7\" (UniqueName: \"kubernetes.io/projected/6b6833b0-f9b4-4249-8452-5234926b013c-kube-api-access-wxdt7\") pod \"limitador-operator-controller-manager-85c4996f8c-2x6kn\" (UID: \"6b6833b0-f9b4-4249-8452-5234926b013c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn"
Apr 17 14:30:58.614458 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.614422 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-extensions-socket-volume\") pod \"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87\" (UID: \"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87\") "
Apr 17 14:30:58.614615 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.614487 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4z8x\" (UniqueName: \"kubernetes.io/projected/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-kube-api-access-d4z8x\") pod \"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87\" (UID: \"78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87\") "
Apr 17 14:30:58.614934 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.614907 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87" (UID: "78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:30:58.616561 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.616542 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-kube-api-access-d4z8x" (OuterVolumeSpecName: "kube-api-access-d4z8x") pod "78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87" (UID: "78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87"). InnerVolumeSpecName "kube-api-access-d4z8x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:30:58.619743 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.619728 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"
Apr 17 14:30:58.626465 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.626441 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn"
Apr 17 14:30:58.718545 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.718333 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-extensions-socket-volume\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:30:58.718545 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.718363 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4z8x\" (UniqueName: \"kubernetes.io/projected/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87-kube-api-access-d4z8x\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:30:58.773076 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.773048 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"]
Apr 17 14:30:58.774484 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:30:58.774444 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aee7dd1_89ff_4f57_8de1_0e33fceddce6.slice/crio-efdf5cb1d4eedb2fcdc63ea528c7cd334237047d77558185d889a7c66fd4da01 WatchSource:0}: Error finding container efdf5cb1d4eedb2fcdc63ea528c7cd334237047d77558185d889a7c66fd4da01: Status 404 returned error can't find the container with id efdf5cb1d4eedb2fcdc63ea528c7cd334237047d77558185d889a7c66fd4da01
Apr 17 14:30:58.793599 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:58.793576 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn"]
Apr 17 14:30:58.794937 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:30:58.794914 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b6833b0_f9b4_4249_8452_5234926b013c.slice/crio-3acf42af7e817478fff3b16b30e0bf7ec788cdf33bdb9b6e5a371eb87d09745e WatchSource:0}: Error finding container 3acf42af7e817478fff3b16b30e0bf7ec788cdf33bdb9b6e5a371eb87d09745e: Status 404 returned error can't find the container with id 3acf42af7e817478fff3b16b30e0bf7ec788cdf33bdb9b6e5a371eb87d09745e
Apr 17 14:30:59.179817 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.179784 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:59.179817 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.179826 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:59.185246 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.185224 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:59.348370 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.348142 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5"]
Apr 17 14:30:59.352430 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.352399 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5"
Apr 17 14:30:59.363562 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.363497 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5"]
Apr 17 14:30:59.427137 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.427113 2575 scope.go:117] "RemoveContainer" containerID="d89e1814328f544e4e3b677b224fac34b46b4a5c51c8a8bd398f3b1dbc680784"
Apr 17 14:30:59.427507 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.427112 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rchsq"
Apr 17 14:30:59.429255 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.429222 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn" event={"ID":"6b6833b0-f9b4-4249-8452-5234926b013c","Type":"ContainerStarted","Data":"3acf42af7e817478fff3b16b30e0bf7ec788cdf33bdb9b6e5a371eb87d09745e"}
Apr 17 14:30:59.431854 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.431799 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79" event={"ID":"5aee7dd1-89ff-4f57-8de1-0e33fceddce6","Type":"ContainerStarted","Data":"a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731"}
Apr 17 14:30:59.431854 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.431835 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79" event={"ID":"5aee7dd1-89ff-4f57-8de1-0e33fceddce6","Type":"ContainerStarted","Data":"efdf5cb1d4eedb2fcdc63ea528c7cd334237047d77558185d889a7c66fd4da01"}
Apr 17 14:30:59.432021 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.431886 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"
Apr 17 14:30:59.436826 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.436791 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f577ffdcb-v5fb5"
Apr 17 14:30:59.478074 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.478011 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79" podStartSLOduration=1.477990148 podStartE2EDuration="1.477990148s" podCreationTimestamp="2026-04-17 14:30:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:30:59.453331324 +0000 UTC m=+577.760084943" watchObservedRunningTime="2026-04-17 14:30:59.477990148 +0000 UTC m=+577.784743767"
Apr 17 14:30:59.521556 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.521517 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64b6cfcbd7-kcp65"]
Apr 17 14:30:59.528761 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.527549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgk5l\" (UniqueName: \"kubernetes.io/projected/ff8e485d-47fe-414b-974b-7926fcc8771c-kube-api-access-rgk5l\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xflb5\" (UID: \"ff8e485d-47fe-414b-974b-7926fcc8771c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5"
Apr 17 14:30:59.528761 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.527617 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff8e485d-47fe-414b-974b-7926fcc8771c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xflb5\" (UID: \"ff8e485d-47fe-414b-974b-7926fcc8771c\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" Apr 17 14:30:59.628563 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.628528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff8e485d-47fe-414b-974b-7926fcc8771c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xflb5\" (UID: \"ff8e485d-47fe-414b-974b-7926fcc8771c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" Apr 17 14:30:59.628855 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.628702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgk5l\" (UniqueName: \"kubernetes.io/projected/ff8e485d-47fe-414b-974b-7926fcc8771c-kube-api-access-rgk5l\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xflb5\" (UID: \"ff8e485d-47fe-414b-974b-7926fcc8771c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" Apr 17 14:30:59.629071 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.629041 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff8e485d-47fe-414b-974b-7926fcc8771c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xflb5\" (UID: \"ff8e485d-47fe-414b-974b-7926fcc8771c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" Apr 17 14:30:59.637191 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.637163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgk5l\" (UniqueName: \"kubernetes.io/projected/ff8e485d-47fe-414b-974b-7926fcc8771c-kube-api-access-rgk5l\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xflb5\" (UID: \"ff8e485d-47fe-414b-974b-7926fcc8771c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" Apr 17 14:30:59.668561 
ip-10-0-138-3 kubenswrapper[2575]: I0417 14:30:59.668528 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" Apr 17 14:31:00.056584 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:00.056556 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5"] Apr 17 14:31:00.059944 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:31:00.059919 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff8e485d_47fe_414b_974b_7926fcc8771c.slice/crio-b49d57e35e065bc98b2dce97c8796764ee4e024838fcbf37607042fdc5d934ce WatchSource:0}: Error finding container b49d57e35e065bc98b2dce97c8796764ee4e024838fcbf37607042fdc5d934ce: Status 404 returned error can't find the container with id b49d57e35e065bc98b2dce97c8796764ee4e024838fcbf37607042fdc5d934ce Apr 17 14:31:00.392248 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:00.392215 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-ctpdf" Apr 17 14:31:00.408320 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:00.408278 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87" path="/var/lib/kubelet/pods/78d0fc5a-9e2b-4bde-bc8e-17dd4a7dba87/volumes" Apr 17 14:31:00.436203 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:00.436162 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" event={"ID":"ff8e485d-47fe-414b-974b-7926fcc8771c","Type":"ContainerStarted","Data":"dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1"} Apr 17 14:31:00.436203 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:00.436207 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" event={"ID":"ff8e485d-47fe-414b-974b-7926fcc8771c","Type":"ContainerStarted","Data":"b49d57e35e065bc98b2dce97c8796764ee4e024838fcbf37607042fdc5d934ce"} Apr 17 14:31:00.436704 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:00.436280 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" Apr 17 14:31:00.438347 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:00.438320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn" event={"ID":"6b6833b0-f9b4-4249-8452-5234926b013c","Type":"ContainerStarted","Data":"75e851808e126ea2194f772a609a96a46acadc4bd7ec70c34fba2495ca94c4fb"} Apr 17 14:31:00.438540 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:00.438516 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn" Apr 17 14:31:00.457717 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:00.457661 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" podStartSLOduration=1.457642409 podStartE2EDuration="1.457642409s" podCreationTimestamp="2026-04-17 14:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:31:00.456948142 +0000 UTC m=+578.763701763" watchObservedRunningTime="2026-04-17 14:31:00.457642409 +0000 UTC m=+578.764396028" Apr 17 14:31:00.477523 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:00.477474 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn" podStartSLOduration=1.308543183 podStartE2EDuration="2.477460216s" 
podCreationTimestamp="2026-04-17 14:30:58 +0000 UTC" firstStartedPulling="2026-04-17 14:30:58.796835405 +0000 UTC m=+577.103589003" lastFinishedPulling="2026-04-17 14:30:59.965752435 +0000 UTC m=+578.272506036" observedRunningTime="2026-04-17 14:31:00.475643783 +0000 UTC m=+578.782397403" watchObservedRunningTime="2026-04-17 14:31:00.477460216 +0000 UTC m=+578.784213834" Apr 17 14:31:10.440718 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:10.440675 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79" Apr 17 14:31:11.444506 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:11.444472 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2x6kn" Apr 17 14:31:11.444947 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:11.444537 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" Apr 17 14:31:11.499347 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:11.499313 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"] Apr 17 14:31:11.499647 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:11.499598 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79" podUID="5aee7dd1-89ff-4f57-8de1-0e33fceddce6" containerName="manager" containerID="cri-o://a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731" gracePeriod=10 Apr 17 14:31:11.749471 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:11.749444 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79" Apr 17 14:31:11.839511 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:11.839462 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-extensions-socket-volume\") pod \"5aee7dd1-89ff-4f57-8de1-0e33fceddce6\" (UID: \"5aee7dd1-89ff-4f57-8de1-0e33fceddce6\") " Apr 17 14:31:11.839708 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:11.839589 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gvgj\" (UniqueName: \"kubernetes.io/projected/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-kube-api-access-2gvgj\") pod \"5aee7dd1-89ff-4f57-8de1-0e33fceddce6\" (UID: \"5aee7dd1-89ff-4f57-8de1-0e33fceddce6\") " Apr 17 14:31:11.839950 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:11.839921 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "5aee7dd1-89ff-4f57-8de1-0e33fceddce6" (UID: "5aee7dd1-89ff-4f57-8de1-0e33fceddce6"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:31:11.841729 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:11.841699 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-kube-api-access-2gvgj" (OuterVolumeSpecName: "kube-api-access-2gvgj") pod "5aee7dd1-89ff-4f57-8de1-0e33fceddce6" (UID: "5aee7dd1-89ff-4f57-8de1-0e33fceddce6"). InnerVolumeSpecName "kube-api-access-2gvgj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:31:11.940545 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:11.940505 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2gvgj\" (UniqueName: \"kubernetes.io/projected/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-kube-api-access-2gvgj\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:31:11.940545 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:11.940546 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5aee7dd1-89ff-4f57-8de1-0e33fceddce6-extensions-socket-volume\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:31:12.484990 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:12.484953 2575 generic.go:358] "Generic (PLEG): container finished" podID="5aee7dd1-89ff-4f57-8de1-0e33fceddce6" containerID="a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731" exitCode=0 Apr 17 14:31:12.485407 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:12.485018 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79" Apr 17 14:31:12.485407 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:12.485040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79" event={"ID":"5aee7dd1-89ff-4f57-8de1-0e33fceddce6","Type":"ContainerDied","Data":"a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731"} Apr 17 14:31:12.485407 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:12.485087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79" event={"ID":"5aee7dd1-89ff-4f57-8de1-0e33fceddce6","Type":"ContainerDied","Data":"efdf5cb1d4eedb2fcdc63ea528c7cd334237047d77558185d889a7c66fd4da01"} Apr 17 14:31:12.485407 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:12.485110 2575 scope.go:117] "RemoveContainer" containerID="a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731" Apr 17 14:31:12.493414 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:12.493398 2575 scope.go:117] "RemoveContainer" containerID="a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731" Apr 17 14:31:12.493648 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:31:12.493631 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731\": container with ID starting with a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731 not found: ID does not exist" containerID="a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731" Apr 17 14:31:12.493698 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:12.493654 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731"} err="failed to get container status 
\"a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731\": rpc error: code = NotFound desc = could not find container \"a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731\": container with ID starting with a472158f0ffe96005a9633b568b768c3b02524c3f7ed19b08b19c5859d644731 not found: ID does not exist" Apr 17 14:31:12.510684 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:12.510665 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"] Apr 17 14:31:12.519031 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:12.519008 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-njj79"] Apr 17 14:31:14.402950 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:14.402919 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aee7dd1-89ff-4f57-8de1-0e33fceddce6" path="/var/lib/kubelet/pods/5aee7dd1-89ff-4f57-8de1-0e33fceddce6/volumes" Apr 17 14:31:24.550037 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.549976 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-64b6cfcbd7-kcp65" podUID="418f4446-5565-4732-aca2-3070fbe1d5c4" containerName="console" containerID="cri-o://9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82" gracePeriod=15 Apr 17 14:31:24.783980 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.783959 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64b6cfcbd7-kcp65_418f4446-5565-4732-aca2-3070fbe1d5c4/console/0.log" Apr 17 14:31:24.784080 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.784019 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64b6cfcbd7-kcp65" Apr 17 14:31:24.863385 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.863353 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-console-config\") pod \"418f4446-5565-4732-aca2-3070fbe1d5c4\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " Apr 17 14:31:24.863560 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.863398 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-oauth-serving-cert\") pod \"418f4446-5565-4732-aca2-3070fbe1d5c4\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " Apr 17 14:31:24.863560 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.863445 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4hwr\" (UniqueName: \"kubernetes.io/projected/418f4446-5565-4732-aca2-3070fbe1d5c4-kube-api-access-l4hwr\") pod \"418f4446-5565-4732-aca2-3070fbe1d5c4\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " Apr 17 14:31:24.863560 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.863462 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-service-ca\") pod \"418f4446-5565-4732-aca2-3070fbe1d5c4\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " Apr 17 14:31:24.863560 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.863479 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-oauth-config\") pod \"418f4446-5565-4732-aca2-3070fbe1d5c4\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " Apr 17 14:31:24.863560 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:31:24.863523 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-trusted-ca-bundle\") pod \"418f4446-5565-4732-aca2-3070fbe1d5c4\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " Apr 17 14:31:24.863794 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.863602 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-serving-cert\") pod \"418f4446-5565-4732-aca2-3070fbe1d5c4\" (UID: \"418f4446-5565-4732-aca2-3070fbe1d5c4\") " Apr 17 14:31:24.863945 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.863782 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-console-config" (OuterVolumeSpecName: "console-config") pod "418f4446-5565-4732-aca2-3070fbe1d5c4" (UID: "418f4446-5565-4732-aca2-3070fbe1d5c4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:31:24.864037 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.864011 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-service-ca" (OuterVolumeSpecName: "service-ca") pod "418f4446-5565-4732-aca2-3070fbe1d5c4" (UID: "418f4446-5565-4732-aca2-3070fbe1d5c4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:31:24.864037 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.864022 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "418f4446-5565-4732-aca2-3070fbe1d5c4" (UID: "418f4446-5565-4732-aca2-3070fbe1d5c4"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:31:24.864216 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.864195 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "418f4446-5565-4732-aca2-3070fbe1d5c4" (UID: "418f4446-5565-4732-aca2-3070fbe1d5c4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:31:24.865665 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.865646 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "418f4446-5565-4732-aca2-3070fbe1d5c4" (UID: "418f4446-5565-4732-aca2-3070fbe1d5c4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:31:24.865767 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.865661 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/418f4446-5565-4732-aca2-3070fbe1d5c4-kube-api-access-l4hwr" (OuterVolumeSpecName: "kube-api-access-l4hwr") pod "418f4446-5565-4732-aca2-3070fbe1d5c4" (UID: "418f4446-5565-4732-aca2-3070fbe1d5c4"). InnerVolumeSpecName "kube-api-access-l4hwr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:31:24.865767 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.865676 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "418f4446-5565-4732-aca2-3070fbe1d5c4" (UID: "418f4446-5565-4732-aca2-3070fbe1d5c4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:31:24.964956 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.964922 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-console-config\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:31:24.964956 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.964950 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-oauth-serving-cert\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:31:24.964956 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.964959 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l4hwr\" (UniqueName: \"kubernetes.io/projected/418f4446-5565-4732-aca2-3070fbe1d5c4-kube-api-access-l4hwr\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:31:24.965185 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.964970 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-service-ca\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:31:24.965185 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.964979 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-oauth-config\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:31:24.965185 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:24.964987 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/418f4446-5565-4732-aca2-3070fbe1d5c4-trusted-ca-bundle\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:31:24.965185 ip-10-0-138-3 kubenswrapper[2575]: I0417 
14:31:24.964996 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418f4446-5565-4732-aca2-3070fbe1d5c4-console-serving-cert\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:31:25.536019 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:25.535991 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64b6cfcbd7-kcp65_418f4446-5565-4732-aca2-3070fbe1d5c4/console/0.log" Apr 17 14:31:25.536178 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:25.536028 2575 generic.go:358] "Generic (PLEG): container finished" podID="418f4446-5565-4732-aca2-3070fbe1d5c4" containerID="9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82" exitCode=2 Apr 17 14:31:25.536178 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:25.536057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b6cfcbd7-kcp65" event={"ID":"418f4446-5565-4732-aca2-3070fbe1d5c4","Type":"ContainerDied","Data":"9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82"} Apr 17 14:31:25.536178 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:25.536093 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b6cfcbd7-kcp65" event={"ID":"418f4446-5565-4732-aca2-3070fbe1d5c4","Type":"ContainerDied","Data":"791ef1e11d333d4bd907837cc4e6903aedcb5cd28c510c15a68774e14337dade"} Apr 17 14:31:25.536178 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:25.536101 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64b6cfcbd7-kcp65" Apr 17 14:31:25.536315 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:25.536106 2575 scope.go:117] "RemoveContainer" containerID="9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82" Apr 17 14:31:25.546688 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:25.546673 2575 scope.go:117] "RemoveContainer" containerID="9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82" Apr 17 14:31:25.546968 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:31:25.546950 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82\": container with ID starting with 9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82 not found: ID does not exist" containerID="9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82" Apr 17 14:31:25.547022 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:25.546977 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82"} err="failed to get container status \"9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82\": rpc error: code = NotFound desc = could not find container \"9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82\": container with ID starting with 9d12b4a3466a7b76a6ff836a07fd3ca39390461d079a800b9edc072e45f9cf82 not found: ID does not exist" Apr 17 14:31:25.560770 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:25.560749 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64b6cfcbd7-kcp65"] Apr 17 14:31:25.567206 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:25.567174 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64b6cfcbd7-kcp65"] Apr 17 14:31:26.403112 ip-10-0-138-3 kubenswrapper[2575]: I0417 
14:31:26.403080 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="418f4446-5565-4732-aca2-3070fbe1d5c4" path="/var/lib/kubelet/pods/418f4446-5565-4732-aca2-3070fbe1d5c4/volumes" Apr 17 14:31:32.984511 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:32.984478 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-nndmz"] Apr 17 14:31:32.984913 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:32.984900 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="418f4446-5565-4732-aca2-3070fbe1d5c4" containerName="console" Apr 17 14:31:32.984962 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:32.984914 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="418f4446-5565-4732-aca2-3070fbe1d5c4" containerName="console" Apr 17 14:31:32.984962 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:32.984946 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aee7dd1-89ff-4f57-8de1-0e33fceddce6" containerName="manager" Apr 17 14:31:32.984962 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:32.984951 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aee7dd1-89ff-4f57-8de1-0e33fceddce6" containerName="manager" Apr 17 14:31:32.985045 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:32.985018 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5aee7dd1-89ff-4f57-8de1-0e33fceddce6" containerName="manager" Apr 17 14:31:32.985045 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:32.985028 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="418f4446-5565-4732-aca2-3070fbe1d5c4" containerName="console" Apr 17 14:31:32.989858 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:32.989840 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-nndmz" Apr 17 14:31:32.992632 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:32.992608 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-zc9dl\"" Apr 17 14:31:32.993970 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:32.993944 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-nndmz"] Apr 17 14:31:33.141641 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:33.141598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dcj\" (UniqueName: \"kubernetes.io/projected/39dc962d-c9f8-48ab-bcbc-6889956610e4-kube-api-access-27dcj\") pod \"authorino-7498df8756-nndmz\" (UID: \"39dc962d-c9f8-48ab-bcbc-6889956610e4\") " pod="kuadrant-system/authorino-7498df8756-nndmz" Apr 17 14:31:33.242395 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:33.242321 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27dcj\" (UniqueName: \"kubernetes.io/projected/39dc962d-c9f8-48ab-bcbc-6889956610e4-kube-api-access-27dcj\") pod \"authorino-7498df8756-nndmz\" (UID: \"39dc962d-c9f8-48ab-bcbc-6889956610e4\") " pod="kuadrant-system/authorino-7498df8756-nndmz" Apr 17 14:31:33.251105 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:33.251082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27dcj\" (UniqueName: \"kubernetes.io/projected/39dc962d-c9f8-48ab-bcbc-6889956610e4-kube-api-access-27dcj\") pod \"authorino-7498df8756-nndmz\" (UID: \"39dc962d-c9f8-48ab-bcbc-6889956610e4\") " pod="kuadrant-system/authorino-7498df8756-nndmz" Apr 17 14:31:33.299537 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:33.299511 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-nndmz"
Apr 17 14:31:33.417515 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:33.417490 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-nndmz"]
Apr 17 14:31:33.419884 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:31:33.419846 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39dc962d_c9f8_48ab_bcbc_6889956610e4.slice/crio-bf34ac8da0ba2322d3dff97d5032e3fa7eadeb608c8a389aba5e68b8dd59d71b WatchSource:0}: Error finding container bf34ac8da0ba2322d3dff97d5032e3fa7eadeb608c8a389aba5e68b8dd59d71b: Status 404 returned error can't find the container with id bf34ac8da0ba2322d3dff97d5032e3fa7eadeb608c8a389aba5e68b8dd59d71b
Apr 17 14:31:33.566080 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:33.566044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-nndmz" event={"ID":"39dc962d-c9f8-48ab-bcbc-6889956610e4","Type":"ContainerStarted","Data":"bf34ac8da0ba2322d3dff97d5032e3fa7eadeb608c8a389aba5e68b8dd59d71b"}
Apr 17 14:31:36.579699 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:36.579660 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-nndmz" event={"ID":"39dc962d-c9f8-48ab-bcbc-6889956610e4","Type":"ContainerStarted","Data":"7ca758696fb54a4e3633e486f5dc0d6a2f2069c74a9c180b0b65ba7fe168e2cf"}
Apr 17 14:31:36.594799 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:31:36.594749 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-nndmz" podStartSLOduration=2.273555297 podStartE2EDuration="4.59473367s" podCreationTimestamp="2026-04-17 14:31:32 +0000 UTC" firstStartedPulling="2026-04-17 14:31:33.421157183 +0000 UTC m=+611.727910780" lastFinishedPulling="2026-04-17 14:31:35.742335553 +0000 UTC m=+614.049089153" observedRunningTime="2026-04-17 14:31:36.594101054 +0000 UTC m=+614.900854673" watchObservedRunningTime="2026-04-17 14:31:36.59473367 +0000 UTC m=+614.901487324"
Apr 17 14:32:07.117505 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.117470 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-df86fd548-xff85"]
Apr 17 14:32:07.121164 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.121144 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-df86fd548-xff85"
Apr 17 14:32:07.125048 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.125023 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 17 14:32:07.125163 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.125022 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-lq9r8\""
Apr 17 14:32:07.125163 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.125023 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 17 14:32:07.128432 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.128412 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-df86fd548-xff85"]
Apr 17 14:32:07.254414 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.254373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b2fn\" (UniqueName: \"kubernetes.io/projected/35b2e5a7-7f6e-47de-8ce9-e16a333c164c-kube-api-access-6b2fn\") pod \"maas-api-df86fd548-xff85\" (UID: \"35b2e5a7-7f6e-47de-8ce9-e16a333c164c\") " pod="opendatahub/maas-api-df86fd548-xff85"
Apr 17 14:32:07.254591 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.254535 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/35b2e5a7-7f6e-47de-8ce9-e16a333c164c-maas-api-tls\") pod \"maas-api-df86fd548-xff85\" (UID: \"35b2e5a7-7f6e-47de-8ce9-e16a333c164c\") " pod="opendatahub/maas-api-df86fd548-xff85"
Apr 17 14:32:07.355704 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.355669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/35b2e5a7-7f6e-47de-8ce9-e16a333c164c-maas-api-tls\") pod \"maas-api-df86fd548-xff85\" (UID: \"35b2e5a7-7f6e-47de-8ce9-e16a333c164c\") " pod="opendatahub/maas-api-df86fd548-xff85"
Apr 17 14:32:07.355924 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.355746 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6b2fn\" (UniqueName: \"kubernetes.io/projected/35b2e5a7-7f6e-47de-8ce9-e16a333c164c-kube-api-access-6b2fn\") pod \"maas-api-df86fd548-xff85\" (UID: \"35b2e5a7-7f6e-47de-8ce9-e16a333c164c\") " pod="opendatahub/maas-api-df86fd548-xff85"
Apr 17 14:32:07.358596 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.358566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/35b2e5a7-7f6e-47de-8ce9-e16a333c164c-maas-api-tls\") pod \"maas-api-df86fd548-xff85\" (UID: \"35b2e5a7-7f6e-47de-8ce9-e16a333c164c\") " pod="opendatahub/maas-api-df86fd548-xff85"
Apr 17 14:32:07.369720 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.369645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b2fn\" (UniqueName: \"kubernetes.io/projected/35b2e5a7-7f6e-47de-8ce9-e16a333c164c-kube-api-access-6b2fn\") pod \"maas-api-df86fd548-xff85\" (UID: \"35b2e5a7-7f6e-47de-8ce9-e16a333c164c\") " pod="opendatahub/maas-api-df86fd548-xff85"
Apr 17 14:32:07.433511 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.433478 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-df86fd548-xff85"
Apr 17 14:32:07.554097 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.554074 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-df86fd548-xff85"]
Apr 17 14:32:07.557202 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:32:07.557167 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35b2e5a7_7f6e_47de_8ce9_e16a333c164c.slice/crio-6501865c44640e94d271741c908c54709a1cf46412e7d136b9f3cd0bbcf944e8 WatchSource:0}: Error finding container 6501865c44640e94d271741c908c54709a1cf46412e7d136b9f3cd0bbcf944e8: Status 404 returned error can't find the container with id 6501865c44640e94d271741c908c54709a1cf46412e7d136b9f3cd0bbcf944e8
Apr 17 14:32:07.558747 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.558729 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 14:32:07.696504 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:07.696425 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-df86fd548-xff85" event={"ID":"35b2e5a7-7f6e-47de-8ce9-e16a333c164c","Type":"ContainerStarted","Data":"6501865c44640e94d271741c908c54709a1cf46412e7d136b9f3cd0bbcf944e8"}
Apr 17 14:32:08.552587 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:08.552545 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-nndmz"]
Apr 17 14:32:08.553104 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:08.552934 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-nndmz" podUID="39dc962d-c9f8-48ab-bcbc-6889956610e4" containerName="authorino" containerID="cri-o://7ca758696fb54a4e3633e486f5dc0d6a2f2069c74a9c180b0b65ba7fe168e2cf" gracePeriod=30
Apr 17 14:32:08.702267 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:08.702236 2575 generic.go:358] "Generic (PLEG): container finished" podID="39dc962d-c9f8-48ab-bcbc-6889956610e4" containerID="7ca758696fb54a4e3633e486f5dc0d6a2f2069c74a9c180b0b65ba7fe168e2cf" exitCode=0
Apr 17 14:32:08.702430 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:08.702307 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-nndmz" event={"ID":"39dc962d-c9f8-48ab-bcbc-6889956610e4","Type":"ContainerDied","Data":"7ca758696fb54a4e3633e486f5dc0d6a2f2069c74a9c180b0b65ba7fe168e2cf"}
Apr 17 14:32:09.724910 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:09.724887 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-nndmz"
Apr 17 14:32:09.883687 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:09.883654 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27dcj\" (UniqueName: \"kubernetes.io/projected/39dc962d-c9f8-48ab-bcbc-6889956610e4-kube-api-access-27dcj\") pod \"39dc962d-c9f8-48ab-bcbc-6889956610e4\" (UID: \"39dc962d-c9f8-48ab-bcbc-6889956610e4\") "
Apr 17 14:32:09.885838 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:09.885808 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39dc962d-c9f8-48ab-bcbc-6889956610e4-kube-api-access-27dcj" (OuterVolumeSpecName: "kube-api-access-27dcj") pod "39dc962d-c9f8-48ab-bcbc-6889956610e4" (UID: "39dc962d-c9f8-48ab-bcbc-6889956610e4"). InnerVolumeSpecName "kube-api-access-27dcj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:32:09.984433 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:09.984355 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-27dcj\" (UniqueName: \"kubernetes.io/projected/39dc962d-c9f8-48ab-bcbc-6889956610e4-kube-api-access-27dcj\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\""
Apr 17 14:32:10.711200 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:10.711110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-nndmz" event={"ID":"39dc962d-c9f8-48ab-bcbc-6889956610e4","Type":"ContainerDied","Data":"bf34ac8da0ba2322d3dff97d5032e3fa7eadeb608c8a389aba5e68b8dd59d71b"}
Apr 17 14:32:10.711200 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:10.711134 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-nndmz"
Apr 17 14:32:10.711200 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:10.711161 2575 scope.go:117] "RemoveContainer" containerID="7ca758696fb54a4e3633e486f5dc0d6a2f2069c74a9c180b0b65ba7fe168e2cf"
Apr 17 14:32:10.712652 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:10.712622 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-df86fd548-xff85" event={"ID":"35b2e5a7-7f6e-47de-8ce9-e16a333c164c","Type":"ContainerStarted","Data":"f63753608612050aa396217e4c87a867853e9f17123eaaa473ec8f7359d4011f"}
Apr 17 14:32:10.712827 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:10.712813 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-df86fd548-xff85"
Apr 17 14:32:10.727978 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:10.727859 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-nndmz"]
Apr 17 14:32:10.732920 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:10.732899 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-nndmz"]
Apr 17 14:32:10.750117 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:10.750070 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-df86fd548-xff85" podStartSLOduration=1.540892465 podStartE2EDuration="3.750055518s" podCreationTimestamp="2026-04-17 14:32:07 +0000 UTC" firstStartedPulling="2026-04-17 14:32:07.558849261 +0000 UTC m=+645.865602859" lastFinishedPulling="2026-04-17 14:32:09.768012299 +0000 UTC m=+648.074765912" observedRunningTime="2026-04-17 14:32:10.749506276 +0000 UTC m=+649.056259923" watchObservedRunningTime="2026-04-17 14:32:10.750055518 +0000 UTC m=+649.056809136"
Apr 17 14:32:12.402407 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:12.402376 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39dc962d-c9f8-48ab-bcbc-6889956610e4" path="/var/lib/kubelet/pods/39dc962d-c9f8-48ab-bcbc-6889956610e4/volumes"
Apr 17 14:32:16.722538 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:16.722509 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-df86fd548-xff85"
Apr 17 14:32:44.991996 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:44.991959 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"]
Apr 17 14:32:44.992451 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:44.992355 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39dc962d-c9f8-48ab-bcbc-6889956610e4" containerName="authorino"
Apr 17 14:32:44.992451 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:44.992366 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="39dc962d-c9f8-48ab-bcbc-6889956610e4" containerName="authorino"
Apr 17 14:32:44.992451 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:44.992450 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="39dc962d-c9f8-48ab-bcbc-6889956610e4" containerName="authorino"
Apr 17 14:32:44.995609 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:44.995587 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:44.998461 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:44.998437 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 17 14:32:44.999983 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:44.999957 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-m8hqm\""
Apr 17 14:32:45.000134 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.000106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\""
Apr 17 14:32:45.000540 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.000519 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 17 14:32:45.003076 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.003056 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"]
Apr 17 14:32:45.106505 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.106470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.106505 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.106505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d1a773-42e8-4254-afba-448fd1692f34-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.106718 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.106584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.106718 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.106603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.106718 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.106642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.106718 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.106669 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp7kj\" (UniqueName: \"kubernetes.io/projected/a7d1a773-42e8-4254-afba-448fd1692f34-kube-api-access-mp7kj\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.207514 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.207472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mp7kj\" (UniqueName: \"kubernetes.io/projected/a7d1a773-42e8-4254-afba-448fd1692f34-kube-api-access-mp7kj\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.207690 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.207558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.207690 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.207581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d1a773-42e8-4254-afba-448fd1692f34-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.207690 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.207630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.207690 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.207660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.207898 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.207696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.208066 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.208043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.208143 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.208063 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.208196 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.208143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.210071 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.210051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a7d1a773-42e8-4254-afba-448fd1692f34-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.210240 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.210225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d1a773-42e8-4254-afba-448fd1692f34-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.215120 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.215095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp7kj\" (UniqueName: \"kubernetes.io/projected/a7d1a773-42e8-4254-afba-448fd1692f34-kube-api-access-mp7kj\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-wclzr\" (UID: \"a7d1a773-42e8-4254-afba-448fd1692f34\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.306324 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.306224 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:32:45.450791 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.444648 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"]
Apr 17 14:32:45.452789 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:32:45.452755 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7d1a773_42e8_4254_afba_448fd1692f34.slice/crio-7a809331dd48f703a6ca3efadec711d2daf373ba51bcbdcbc5f34ccdbe9f752d WatchSource:0}: Error finding container 7a809331dd48f703a6ca3efadec711d2daf373ba51bcbdcbc5f34ccdbe9f752d: Status 404 returned error can't find the container with id 7a809331dd48f703a6ca3efadec711d2daf373ba51bcbdcbc5f34ccdbe9f752d
Apr 17 14:32:45.843831 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:45.843767 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr" event={"ID":"a7d1a773-42e8-4254-afba-448fd1692f34","Type":"ContainerStarted","Data":"7a809331dd48f703a6ca3efadec711d2daf373ba51bcbdcbc5f34ccdbe9f752d"}
Apr 17 14:32:52.882587 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:32:52.882548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr" event={"ID":"a7d1a773-42e8-4254-afba-448fd1692f34","Type":"ContainerStarted","Data":"0dce43780cefe497a084349d980c8ecfc0cad8feae83cc0483c6284a7a1c3336"}
Apr 17 14:33:00.699004 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.698972 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"]
Apr 17 14:33:00.702757 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.702737 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.705654 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.705635 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\""
Apr 17 14:33:00.711913 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.711861 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"]
Apr 17 14:33:00.878452 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.878410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.878689 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.878472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/383c9d5b-7fac-4518-b520-6b972699a934-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.878689 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.878500 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jshpw\" (UniqueName: \"kubernetes.io/projected/383c9d5b-7fac-4518-b520-6b972699a934-kube-api-access-jshpw\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.878689 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.878544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.878689 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.878641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.878850 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.878708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.914433 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.914399 2575 generic.go:358] "Generic (PLEG): container finished" podID="a7d1a773-42e8-4254-afba-448fd1692f34" containerID="0dce43780cefe497a084349d980c8ecfc0cad8feae83cc0483c6284a7a1c3336" exitCode=0
Apr 17 14:33:00.914630 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.914453 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr" event={"ID":"a7d1a773-42e8-4254-afba-448fd1692f34","Type":"ContainerDied","Data":"0dce43780cefe497a084349d980c8ecfc0cad8feae83cc0483c6284a7a1c3336"}
Apr 17 14:33:00.979518 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.979429 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.979518 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.979475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.979518 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.979518 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.979793 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.979572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.979793 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.979607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/383c9d5b-7fac-4518-b520-6b972699a934-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.979793 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.979633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jshpw\" (UniqueName: \"kubernetes.io/projected/383c9d5b-7fac-4518-b520-6b972699a934-kube-api-access-jshpw\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.979998 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.979945 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.979998 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.979991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.980103 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.980035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.981913 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.981882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/383c9d5b-7fac-4518-b520-6b972699a934-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.982205 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.982183 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/383c9d5b-7fac-4518-b520-6b972699a934-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:00.987189 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:00.987169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jshpw\" (UniqueName: \"kubernetes.io/projected/383c9d5b-7fac-4518-b520-6b972699a934-kube-api-access-jshpw\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-csf8l\" (UID: \"383c9d5b-7fac-4518-b520-6b972699a934\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:01.014979 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:01.014947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"
Apr 17 14:33:01.152949 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:01.152901 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l"]
Apr 17 14:33:01.155426 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:33:01.155388 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod383c9d5b_7fac_4518_b520_6b972699a934.slice/crio-b0956be5fd4a4bf518ea0bbe5820311d84fff24ba7544eea8d10b7871fd1943d WatchSource:0}: Error finding container b0956be5fd4a4bf518ea0bbe5820311d84fff24ba7544eea8d10b7871fd1943d: Status 404 returned error can't find the container with id b0956be5fd4a4bf518ea0bbe5820311d84fff24ba7544eea8d10b7871fd1943d
Apr 17 14:33:01.921859 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:01.921190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l" event={"ID":"383c9d5b-7fac-4518-b520-6b972699a934","Type":"ContainerStarted","Data":"1fd02c6f3d75ef084f514708adec7e26693d319685a0e9e0b7e992c156f49874"}
Apr 17 14:33:01.921859 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:01.921236 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l" event={"ID":"383c9d5b-7fac-4518-b520-6b972699a934","Type":"ContainerStarted","Data":"b0956be5fd4a4bf518ea0bbe5820311d84fff24ba7544eea8d10b7871fd1943d"}
Apr 17 14:33:05.940018 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:05.939981 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr" event={"ID":"a7d1a773-42e8-4254-afba-448fd1692f34","Type":"ContainerStarted","Data":"09a4dd07f6d5ef426136095e386dae0635c06b7fc77dbc62254c9eedac4477e2"}
Apr 17 14:33:05.940446 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:05.940174 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr"
Apr 17 14:33:05.959959 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:05.959905 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr" podStartSLOduration=2.14282556 podStartE2EDuration="21.959890684s" podCreationTimestamp="2026-04-17 14:32:44 +0000 UTC" firstStartedPulling="2026-04-17 14:32:45.455029192 +0000 UTC m=+683.761782804" lastFinishedPulling="2026-04-17 14:33:05.272094323 +0000 UTC m=+703.578847928" observedRunningTime="2026-04-17 14:33:05.957031056 +0000 UTC m=+704.263784676" watchObservedRunningTime="2026-04-17 14:33:05.959890684 +0000 UTC m=+704.266644301"
Apr 17 14:33:06.987621 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:06.987581 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4"]
Apr 17 14:33:06.993015 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:06.992995 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4"
Apr 17 14:33:06.997317 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:06.997294 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 17 14:33:07.000291 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.000265 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4"]
Apr 17 14:33:07.044317 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.044289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4"
Apr 17 14:33:07.044317 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.044324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4"
Apr 17 14:33:07.044654 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.044369 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9459ca47-7122-47fb-98b9-8f70bee6951a-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") "
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.044654 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.044393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmd2g\" (UniqueName: \"kubernetes.io/projected/9459ca47-7122-47fb-98b9-8f70bee6951a-kube-api-access-bmd2g\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.044654 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.044428 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.044654 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.044458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.145602 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.145563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.146098 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.146077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.146322 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.146306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9459ca47-7122-47fb-98b9-8f70bee6951a-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.146915 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.146892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmd2g\" (UniqueName: \"kubernetes.io/projected/9459ca47-7122-47fb-98b9-8f70bee6951a-kube-api-access-bmd2g\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.147143 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.147130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.147612 ip-10-0-138-3 kubenswrapper[2575]: I0417 
14:33:07.147595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.147825 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.146520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.147963 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.146482 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.148905 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.148113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.150643 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.150600 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9459ca47-7122-47fb-98b9-8f70bee6951a-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.153173 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.153144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9459ca47-7122-47fb-98b9-8f70bee6951a-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.156771 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.156448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmd2g\" (UniqueName: \"kubernetes.io/projected/9459ca47-7122-47fb-98b9-8f70bee6951a-kube-api-access-bmd2g\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4\" (UID: \"9459ca47-7122-47fb-98b9-8f70bee6951a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.304906 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.304806 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:07.454807 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.454778 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4"] Apr 17 14:33:07.457100 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:33:07.457065 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9459ca47_7122_47fb_98b9_8f70bee6951a.slice/crio-8a4658e7e6841844fc1613967b3241e693931d0116bb4aaad6c9353e88b46083 WatchSource:0}: Error finding container 8a4658e7e6841844fc1613967b3241e693931d0116bb4aaad6c9353e88b46083: Status 404 returned error can't find the container with id 8a4658e7e6841844fc1613967b3241e693931d0116bb4aaad6c9353e88b46083 Apr 17 14:33:07.952187 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.951254 2575 generic.go:358] "Generic (PLEG): container finished" podID="383c9d5b-7fac-4518-b520-6b972699a934" containerID="1fd02c6f3d75ef084f514708adec7e26693d319685a0e9e0b7e992c156f49874" exitCode=0 Apr 17 14:33:07.952187 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.951383 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l" event={"ID":"383c9d5b-7fac-4518-b520-6b972699a934","Type":"ContainerDied","Data":"1fd02c6f3d75ef084f514708adec7e26693d319685a0e9e0b7e992c156f49874"} Apr 17 14:33:07.954169 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.954135 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" event={"ID":"9459ca47-7122-47fb-98b9-8f70bee6951a","Type":"ContainerStarted","Data":"42a09ceb450f84277b3814d6aada3c86ca5915a9971c9bcb786793d4fc598ca7"} Apr 17 14:33:07.954270 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:07.954171 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" event={"ID":"9459ca47-7122-47fb-98b9-8f70bee6951a","Type":"ContainerStarted","Data":"8a4658e7e6841844fc1613967b3241e693931d0116bb4aaad6c9353e88b46083"} Apr 17 14:33:08.960003 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:08.959966 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l" event={"ID":"383c9d5b-7fac-4518-b520-6b972699a934","Type":"ContainerStarted","Data":"13481eb15e1b31ab9aae2d7dd7b0b4f3aea550b9ed7c8012b0dbdae0d7bfc460"} Apr 17 14:33:08.960495 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:08.960189 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l" Apr 17 14:33:08.984011 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:08.983957 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l" podStartSLOduration=8.567921449 podStartE2EDuration="8.983941957s" podCreationTimestamp="2026-04-17 14:33:00 +0000 UTC" firstStartedPulling="2026-04-17 14:33:07.952323547 +0000 UTC m=+706.259077158" lastFinishedPulling="2026-04-17 14:33:08.368344065 +0000 UTC m=+706.675097666" observedRunningTime="2026-04-17 14:33:08.982851466 +0000 UTC m=+707.289605078" watchObservedRunningTime="2026-04-17 14:33:08.983941957 +0000 UTC m=+707.290695577" Apr 17 14:33:13.986952 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:13.986906 2575 generic.go:358] "Generic (PLEG): container finished" podID="9459ca47-7122-47fb-98b9-8f70bee6951a" containerID="42a09ceb450f84277b3814d6aada3c86ca5915a9971c9bcb786793d4fc598ca7" exitCode=0 Apr 17 14:33:13.987390 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:13.986966 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" 
event={"ID":"9459ca47-7122-47fb-98b9-8f70bee6951a","Type":"ContainerDied","Data":"42a09ceb450f84277b3814d6aada3c86ca5915a9971c9bcb786793d4fc598ca7"} Apr 17 14:33:14.992447 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:14.992412 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" event={"ID":"9459ca47-7122-47fb-98b9-8f70bee6951a","Type":"ContainerStarted","Data":"239fdc9294c947ec80d681e8af68fc2210a33f139d9b5f75806bffaae53f332a"} Apr 17 14:33:14.992860 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:14.992626 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:15.011407 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:15.011362 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" podStartSLOduration=8.558153759 podStartE2EDuration="9.011349965s" podCreationTimestamp="2026-04-17 14:33:06 +0000 UTC" firstStartedPulling="2026-04-17 14:33:13.98761009 +0000 UTC m=+712.294363689" lastFinishedPulling="2026-04-17 14:33:14.440806297 +0000 UTC m=+712.747559895" observedRunningTime="2026-04-17 14:33:15.00877614 +0000 UTC m=+713.315529762" watchObservedRunningTime="2026-04-17 14:33:15.011349965 +0000 UTC m=+713.318103625" Apr 17 14:33:16.960760 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:16.960716 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-wclzr" Apr 17 14:33:19.091263 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.091231 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26"] Apr 17 14:33:19.191647 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.191609 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26"] Apr 17 14:33:19.191806 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.191745 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.194416 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.194391 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 14:33:19.273432 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.273398 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.273612 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.273452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.273612 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.273515 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.273612 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.273541 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ls49\" (UniqueName: \"kubernetes.io/projected/b9998793-962e-4833-81fa-b0701e36a088-kube-api-access-8ls49\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.273612 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.273597 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b9998793-962e-4833-81fa-b0701e36a088-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.273793 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.273650 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.374853 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.374761 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.374853 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.374821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: 
\"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.375098 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.374910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.375098 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.374951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ls49\" (UniqueName: \"kubernetes.io/projected/b9998793-962e-4833-81fa-b0701e36a088-kube-api-access-8ls49\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.375199 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.375089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b9998793-962e-4833-81fa-b0701e36a088-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.375199 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.375142 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.375302 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.375232 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.375302 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.375266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.375389 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.375322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.377336 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.377318 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b9998793-962e-4833-81fa-b0701e36a088-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.377595 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.377575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b9998793-962e-4833-81fa-b0701e36a088-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.382234 ip-10-0-138-3 
kubenswrapper[2575]: I0417 14:33:19.382210 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ls49\" (UniqueName: \"kubernetes.io/projected/b9998793-962e-4833-81fa-b0701e36a088-kube-api-access-8ls49\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pgx26\" (UID: \"b9998793-962e-4833-81fa-b0701e36a088\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.503244 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.503203 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:19.640306 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.640281 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26"] Apr 17 14:33:19.642693 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:33:19.642638 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9998793_962e_4833_81fa_b0701e36a088.slice/crio-cea32aeddeebb767a72f81097248cb3685f78410e61668a919ffbe58a09c1f9a WatchSource:0}: Error finding container cea32aeddeebb767a72f81097248cb3685f78410e61668a919ffbe58a09c1f9a: Status 404 returned error can't find the container with id cea32aeddeebb767a72f81097248cb3685f78410e61668a919ffbe58a09c1f9a Apr 17 14:33:19.978361 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:19.978263 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-csf8l" Apr 17 14:33:20.013358 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:20.013321 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" event={"ID":"b9998793-962e-4833-81fa-b0701e36a088","Type":"ContainerStarted","Data":"bd6f1c6f1032570df880c12e98a8ad389f7fb04795ef0995ea10baed879c0ce2"} Apr 17 14:33:20.013554 ip-10-0-138-3 kubenswrapper[2575]: I0417 
14:33:20.013365 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" event={"ID":"b9998793-962e-4833-81fa-b0701e36a088","Type":"ContainerStarted","Data":"cea32aeddeebb767a72f81097248cb3685f78410e61668a919ffbe58a09c1f9a"} Apr 17 14:33:26.009703 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:26.009672 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4" Apr 17 14:33:26.038648 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:26.038608 2575 generic.go:358] "Generic (PLEG): container finished" podID="b9998793-962e-4833-81fa-b0701e36a088" containerID="bd6f1c6f1032570df880c12e98a8ad389f7fb04795ef0995ea10baed879c0ce2" exitCode=0 Apr 17 14:33:26.038828 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:26.038680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" event={"ID":"b9998793-962e-4833-81fa-b0701e36a088","Type":"ContainerDied","Data":"bd6f1c6f1032570df880c12e98a8ad389f7fb04795ef0995ea10baed879c0ce2"} Apr 17 14:33:27.044652 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:27.044616 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" event={"ID":"b9998793-962e-4833-81fa-b0701e36a088","Type":"ContainerStarted","Data":"aba8dcf2cc846ce379902271029d4f8871c7e67f85cea9dc3ebb9fd2f200a4c6"} Apr 17 14:33:27.045491 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:27.045470 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:33:27.062383 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:27.062342 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" podStartSLOduration=7.829877978 podStartE2EDuration="8.062329243s" podCreationTimestamp="2026-04-17 14:33:19 
+0000 UTC" firstStartedPulling="2026-04-17 14:33:26.039674415 +0000 UTC m=+724.346428020" lastFinishedPulling="2026-04-17 14:33:26.272125683 +0000 UTC m=+724.578879285" observedRunningTime="2026-04-17 14:33:27.061545788 +0000 UTC m=+725.368299410" watchObservedRunningTime="2026-04-17 14:33:27.062329243 +0000 UTC m=+725.369082862" Apr 17 14:33:38.063696 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:33:38.063657 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pgx26" Apr 17 14:45:44.674449 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:44.674409 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5"] Apr 17 14:45:44.674968 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:44.674651 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" podUID="ff8e485d-47fe-414b-974b-7926fcc8771c" containerName="manager" containerID="cri-o://dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1" gracePeriod=10 Apr 17 14:45:45.625561 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.625539 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" Apr 17 14:45:45.782358 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.782277 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff8e485d-47fe-414b-974b-7926fcc8771c-extensions-socket-volume\") pod \"ff8e485d-47fe-414b-974b-7926fcc8771c\" (UID: \"ff8e485d-47fe-414b-974b-7926fcc8771c\") " Apr 17 14:45:45.782358 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.782318 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgk5l\" (UniqueName: \"kubernetes.io/projected/ff8e485d-47fe-414b-974b-7926fcc8771c-kube-api-access-rgk5l\") pod \"ff8e485d-47fe-414b-974b-7926fcc8771c\" (UID: \"ff8e485d-47fe-414b-974b-7926fcc8771c\") " Apr 17 14:45:45.782724 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.782639 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8e485d-47fe-414b-974b-7926fcc8771c-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "ff8e485d-47fe-414b-974b-7926fcc8771c" (UID: "ff8e485d-47fe-414b-974b-7926fcc8771c"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:45:45.784753 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.784708 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8e485d-47fe-414b-974b-7926fcc8771c-kube-api-access-rgk5l" (OuterVolumeSpecName: "kube-api-access-rgk5l") pod "ff8e485d-47fe-414b-974b-7926fcc8771c" (UID: "ff8e485d-47fe-414b-974b-7926fcc8771c"). InnerVolumeSpecName "kube-api-access-rgk5l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:45:45.825091 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.825062 2575 generic.go:358] "Generic (PLEG): container finished" podID="ff8e485d-47fe-414b-974b-7926fcc8771c" containerID="dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1" exitCode=0 Apr 17 14:45:45.825258 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.825125 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" Apr 17 14:45:45.825258 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.825156 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" event={"ID":"ff8e485d-47fe-414b-974b-7926fcc8771c","Type":"ContainerDied","Data":"dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1"} Apr 17 14:45:45.825258 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.825203 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5" event={"ID":"ff8e485d-47fe-414b-974b-7926fcc8771c","Type":"ContainerDied","Data":"b49d57e35e065bc98b2dce97c8796764ee4e024838fcbf37607042fdc5d934ce"} Apr 17 14:45:45.825258 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.825226 2575 scope.go:117] "RemoveContainer" containerID="dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1" Apr 17 14:45:45.838183 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.837851 2575 scope.go:117] "RemoveContainer" containerID="dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1" Apr 17 14:45:45.838815 ip-10-0-138-3 kubenswrapper[2575]: E0417 14:45:45.838781 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1\": container with ID starting with 
dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1 not found: ID does not exist" containerID="dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1" Apr 17 14:45:45.838939 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.838826 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1"} err="failed to get container status \"dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1\": rpc error: code = NotFound desc = could not find container \"dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1\": container with ID starting with dd7e9756b3149abcee37e82190a674b77a374e5944927a969969b17633e124a1 not found: ID does not exist" Apr 17 14:45:45.867762 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.867729 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5"] Apr 17 14:45:45.869610 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.869591 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xflb5"] Apr 17 14:45:45.883783 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.883756 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff8e485d-47fe-414b-974b-7926fcc8771c-extensions-socket-volume\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:45:45.883906 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:45.883786 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rgk5l\" (UniqueName: \"kubernetes.io/projected/ff8e485d-47fe-414b-974b-7926fcc8771c-kube-api-access-rgk5l\") on node \"ip-10-0-138-3.ec2.internal\" DevicePath \"\"" Apr 17 14:45:46.403465 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:45:46.403421 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="ff8e485d-47fe-414b-974b-7926fcc8771c" path="/var/lib/kubelet/pods/ff8e485d-47fe-414b-974b-7926fcc8771c/volumes" Apr 17 14:46:50.763461 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.763376 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q"] Apr 17 14:46:50.763889 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.763801 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff8e485d-47fe-414b-974b-7926fcc8771c" containerName="manager" Apr 17 14:46:50.763889 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.763813 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8e485d-47fe-414b-974b-7926fcc8771c" containerName="manager" Apr 17 14:46:50.763975 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.763910 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff8e485d-47fe-414b-974b-7926fcc8771c" containerName="manager" Apr 17 14:46:50.766821 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.766803 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" Apr 17 14:46:50.770484 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.770449 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-hgfvd\"" Apr 17 14:46:50.790433 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.790395 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q"] Apr 17 14:46:50.813934 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.813892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0eded2f2-1957-49d6-8417-7fc097cb5480-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q\" (UID: \"0eded2f2-1957-49d6-8417-7fc097cb5480\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" Apr 17 14:46:50.814121 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.813958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n68t\" (UniqueName: \"kubernetes.io/projected/0eded2f2-1957-49d6-8417-7fc097cb5480-kube-api-access-9n68t\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q\" (UID: \"0eded2f2-1957-49d6-8417-7fc097cb5480\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" Apr 17 14:46:50.915265 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.915229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n68t\" (UniqueName: \"kubernetes.io/projected/0eded2f2-1957-49d6-8417-7fc097cb5480-kube-api-access-9n68t\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q\" (UID: \"0eded2f2-1957-49d6-8417-7fc097cb5480\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" Apr 17 14:46:50.915468 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.915342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0eded2f2-1957-49d6-8417-7fc097cb5480-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q\" (UID: \"0eded2f2-1957-49d6-8417-7fc097cb5480\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" Apr 17 14:46:50.915720 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.915701 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0eded2f2-1957-49d6-8417-7fc097cb5480-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q\" (UID: \"0eded2f2-1957-49d6-8417-7fc097cb5480\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" Apr 17 14:46:50.937436 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:50.937399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n68t\" (UniqueName: \"kubernetes.io/projected/0eded2f2-1957-49d6-8417-7fc097cb5480-kube-api-access-9n68t\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q\" (UID: \"0eded2f2-1957-49d6-8417-7fc097cb5480\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" Apr 17 14:46:51.077450 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:51.077411 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" Apr 17 14:46:51.236976 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:51.236937 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q"] Apr 17 14:46:51.240378 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:46:51.240349 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eded2f2_1957_49d6_8417_7fc097cb5480.slice/crio-835d309e899ad3ab494a1a65e76cb56641976c235a099afefa0cc5a0260d0386 WatchSource:0}: Error finding container 835d309e899ad3ab494a1a65e76cb56641976c235a099afefa0cc5a0260d0386: Status 404 returned error can't find the container with id 835d309e899ad3ab494a1a65e76cb56641976c235a099afefa0cc5a0260d0386 Apr 17 14:46:51.242719 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:51.242699 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:46:52.083054 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:52.083008 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" event={"ID":"0eded2f2-1957-49d6-8417-7fc097cb5480","Type":"ContainerStarted","Data":"5e30ca8bacaf3b613d124f4879b12882f960e72820b5d3dae3e25d029789ff37"} Apr 17 14:46:52.083054 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:52.083057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" event={"ID":"0eded2f2-1957-49d6-8417-7fc097cb5480","Type":"ContainerStarted","Data":"835d309e899ad3ab494a1a65e76cb56641976c235a099afefa0cc5a0260d0386"} Apr 17 14:46:52.083488 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:52.083120 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" Apr 17 14:46:52.134887 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:46:52.134826 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" podStartSLOduration=2.13481109 podStartE2EDuration="2.13481109s" podCreationTimestamp="2026-04-17 14:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:46:52.134218317 +0000 UTC m=+1530.440971937" watchObservedRunningTime="2026-04-17 14:46:52.13481109 +0000 UTC m=+1530.441564711" Apr 17 14:47:03.089004 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:47:03.088972 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q" Apr 17 14:56:32.821953 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:32.821912 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-sp7bk_cda3b0f0-f542-412c-aab3-748fec9a0d43/manager/0.log" Apr 17 14:56:32.935911 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:32.935861 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-df86fd548-xff85_35b2e5a7-7f6e-47de-8ce9-e16a333c164c/maas-api/0.log" Apr 17 14:56:33.163335 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:33.163253 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-24dl9_c4818cef-8813-4ed3-ae34-1ff8d2acb789/manager/2.log" Apr 17 14:56:33.401938 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:33.401906 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-58c8f88b6d-rhcxr_5e4f5f7f-c245-4add-bc95-575616281d6a/manager/0.log" Apr 17 14:56:35.018815 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:35.018782 
2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-ctpdf_cd186241-8519-4040-941f-5bfbe4ca67a8/manager/0.log" Apr 17 14:56:35.344413 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:35.344382 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q_0eded2f2-1957-49d6-8417-7fc097cb5480/manager/0.log" Apr 17 14:56:35.581703 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:35.581674 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-2x6kn_6b6833b0-f9b4-4249-8452-5234926b013c/manager/0.log" Apr 17 14:56:36.039671 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:36.039642 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4hj5k_0adf8995-4617-448e-b417-f69e0ac85b30/discovery/0.log" Apr 17 14:56:36.138021 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:36.137993 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5b474cc896-hrn2m_1b235a5d-f63d-4963-bd85-69721b597cd4/kube-auth-proxy/0.log" Apr 17 14:56:36.460626 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:36.460599 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-685d56bbbb-mp6gq_c067d95c-e09f-4c47-8281-f638f4740633/router/0.log" Apr 17 14:56:36.880068 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:36.880036 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-pgx26_b9998793-962e-4833-81fa-b0701e36a088/storage-initializer/0.log" Apr 17 14:56:36.888425 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:36.888395 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-pgx26_b9998793-962e-4833-81fa-b0701e36a088/main/0.log" Apr 17 14:56:37.002533 
ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:37.002499 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-csf8l_383c9d5b-7fac-4518-b520-6b972699a934/storage-initializer/0.log" Apr 17 14:56:37.009510 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:37.009488 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-csf8l_383c9d5b-7fac-4518-b520-6b972699a934/main/0.log" Apr 17 14:56:37.113224 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:37.113197 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4_9459ca47-7122-47fb-98b9-8f70bee6951a/storage-initializer/0.log" Apr 17 14:56:37.120905 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:37.120865 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccfctd4_9459ca47-7122-47fb-98b9-8f70bee6951a/main/0.log" Apr 17 14:56:37.345177 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:37.345144 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-wclzr_a7d1a773-42e8-4254-afba-448fd1692f34/storage-initializer/0.log" Apr 17 14:56:37.351903 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:37.351864 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-wclzr_a7d1a773-42e8-4254-afba-448fd1692f34/main/0.log" Apr 17 14:56:43.814566 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:43.814528 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fhzz8_05b23a0f-2e19-4ad7-861d-8f2bb92bc69c/global-pull-secret-syncer/0.log" Apr 17 14:56:43.926551 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:43.926502 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-bx274_f386792e-2dbd-4e36-af17-6dbd71a6ad31/konnectivity-agent/0.log" Apr 17 14:56:44.038140 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:44.038113 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-3.ec2.internal_fcb38c6600a2cdeab28c1b9fce28f12c/haproxy/0.log" Apr 17 14:56:48.494379 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:48.494346 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-ctpdf_cd186241-8519-4040-941f-5bfbe4ca67a8/manager/0.log" Apr 17 14:56:48.649423 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:48.649375 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-bhh5q_0eded2f2-1957-49d6-8417-7fc097cb5480/manager/0.log" Apr 17 14:56:48.737440 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:48.737400 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-2x6kn_6b6833b0-f9b4-4249-8452-5234926b013c/manager/0.log" Apr 17 14:56:50.178573 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.178477 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_931dfe5d-2ef0-45e0-9977-5d431e964c6e/alertmanager/0.log" Apr 17 14:56:50.201110 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.201082 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_931dfe5d-2ef0-45e0-9977-5d431e964c6e/config-reloader/0.log" Apr 17 14:56:50.221855 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.221829 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_931dfe5d-2ef0-45e0-9977-5d431e964c6e/kube-rbac-proxy-web/0.log" Apr 17 14:56:50.248109 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.248079 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_931dfe5d-2ef0-45e0-9977-5d431e964c6e/kube-rbac-proxy/0.log" Apr 17 14:56:50.274485 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.274457 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_931dfe5d-2ef0-45e0-9977-5d431e964c6e/kube-rbac-proxy-metric/0.log" Apr 17 14:56:50.295709 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.295674 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_931dfe5d-2ef0-45e0-9977-5d431e964c6e/prom-label-proxy/0.log" Apr 17 14:56:50.337028 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.337002 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_931dfe5d-2ef0-45e0-9977-5d431e964c6e/init-config-reloader/0.log" Apr 17 14:56:50.387675 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.387646 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zrlkv_7951b390-3fe0-4bc3-bd2a-fde607c15638/cluster-monitoring-operator/0.log" Apr 17 14:56:50.470021 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.469914 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-558d8d6877-tvdz9_d3b554de-ad0f-47d2-8d4e-73fe61a9588d/metrics-server/0.log" Apr 17 14:56:50.599062 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.599030 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pnrxk_48c89e9d-7b69-42c5-a313-71254165cc50/node-exporter/0.log" Apr 17 14:56:50.619904 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.619879 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pnrxk_48c89e9d-7b69-42c5-a313-71254165cc50/kube-rbac-proxy/0.log" Apr 17 14:56:50.639432 ip-10-0-138-3 kubenswrapper[2575]: I0417 
14:56:50.639401 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pnrxk_48c89e9d-7b69-42c5-a313-71254165cc50/init-textfile/0.log" Apr 17 14:56:50.734824 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.734797 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t7ww2_e8d4b72a-7437-4185-ba66-defa6c9c36cf/kube-rbac-proxy-main/0.log" Apr 17 14:56:50.763779 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.763748 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t7ww2_e8d4b72a-7437-4185-ba66-defa6c9c36cf/kube-rbac-proxy-self/0.log" Apr 17 14:56:50.784975 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:50.784950 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t7ww2_e8d4b72a-7437-4185-ba66-defa6c9c36cf/openshift-state-metrics/0.log" Apr 17 14:56:51.044572 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:51.044498 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-757dc89df9-s8xkt_63eb63dd-c3b9-4102-80f3-158a88f0d1f1/telemeter-client/0.log" Apr 17 14:56:51.064038 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:51.064014 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-757dc89df9-s8xkt_63eb63dd-c3b9-4102-80f3-158a88f0d1f1/reload/0.log" Apr 17 14:56:51.086033 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:51.086005 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-757dc89df9-s8xkt_63eb63dd-c3b9-4102-80f3-158a88f0d1f1/kube-rbac-proxy/0.log" Apr 17 14:56:51.134300 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:51.134273 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-7f8cfbb8c7-ptxbt_60b0eda9-1239-430f-bd31-254ff621c737/thanos-query/0.log" Apr 17 14:56:51.173910 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:51.173863 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f8cfbb8c7-ptxbt_60b0eda9-1239-430f-bd31-254ff621c737/kube-rbac-proxy-web/0.log" Apr 17 14:56:51.208166 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:51.208141 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f8cfbb8c7-ptxbt_60b0eda9-1239-430f-bd31-254ff621c737/kube-rbac-proxy/0.log" Apr 17 14:56:51.227183 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:51.227157 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f8cfbb8c7-ptxbt_60b0eda9-1239-430f-bd31-254ff621c737/prom-label-proxy/0.log" Apr 17 14:56:51.252841 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:51.252812 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f8cfbb8c7-ptxbt_60b0eda9-1239-430f-bd31-254ff621c737/kube-rbac-proxy-rules/0.log" Apr 17 14:56:51.276033 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:51.276004 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f8cfbb8c7-ptxbt_60b0eda9-1239-430f-bd31-254ff621c737/kube-rbac-proxy-metrics/0.log" Apr 17 14:56:52.569372 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.569328 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c"] Apr 17 14:56:52.573053 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.573029 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.576323 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.576297 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fr76n\"/\"kube-root-ca.crt\"" Apr 17 14:56:52.576450 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.576335 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fr76n\"/\"openshift-service-ca.crt\"" Apr 17 14:56:52.577463 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.577442 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-fr76n\"/\"default-dockercfg-n2jw9\"" Apr 17 14:56:52.585371 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.585342 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c"] Apr 17 14:56:52.675317 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.675275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-podres\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.675317 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.675318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9hb2\" (UniqueName: \"kubernetes.io/projected/a4aef00f-6e98-40ba-9a69-40e5715673c4-kube-api-access-j9hb2\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.675536 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.675412 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-lib-modules\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.675536 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.675462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-sys\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.675536 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.675494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-proc\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.776803 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.776772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-lib-modules\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.777001 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.776832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-sys\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " 
pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.777001 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.776859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-proc\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.777001 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.776911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-podres\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.777001 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.776939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9hb2\" (UniqueName: \"kubernetes.io/projected/a4aef00f-6e98-40ba-9a69-40e5715673c4-kube-api-access-j9hb2\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.777001 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.776972 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-sys\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.777001 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.776972 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-lib-modules\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.777001 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.776972 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-proc\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.777269 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.777042 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a4aef00f-6e98-40ba-9a69-40e5715673c4-podres\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.797860 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.797827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9hb2\" (UniqueName: \"kubernetes.io/projected/a4aef00f-6e98-40ba-9a69-40e5715673c4-kube-api-access-j9hb2\") pod \"perf-node-gather-daemonset-n2v6c\" (UID: \"a4aef00f-6e98-40ba-9a69-40e5715673c4\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" Apr 17 14:56:52.884020 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:52.883930 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c"
Apr 17 14:56:53.042576 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:53.042545 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c"]
Apr 17 14:56:53.044013 ip-10-0-138-3 kubenswrapper[2575]: W0417 14:56:53.043983 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda4aef00f_6e98_40ba_9a69_40e5715673c4.slice/crio-64f0252864a6de35e313d3fae038b269db9d003ce5a795bd964e9e17a4341e46 WatchSource:0}: Error finding container 64f0252864a6de35e313d3fae038b269db9d003ce5a795bd964e9e17a4341e46: Status 404 returned error can't find the container with id 64f0252864a6de35e313d3fae038b269db9d003ce5a795bd964e9e17a4341e46
Apr 17 14:56:53.045765 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:53.045747 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 14:56:53.385674 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:53.385649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f577ffdcb-v5fb5_0f3d3e54-7019-4a48-81d1-e73537173f93/console/0.log"
Apr 17 14:56:53.391824 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:53.391791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" event={"ID":"a4aef00f-6e98-40ba-9a69-40e5715673c4","Type":"ContainerStarted","Data":"cf6c7699cfd048c280b520d52fcd921b57c4363f5cd9bc17d0f05ea66e78b1f2"}
Apr 17 14:56:53.391987 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:53.391828 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" event={"ID":"a4aef00f-6e98-40ba-9a69-40e5715673c4","Type":"ContainerStarted","Data":"64f0252864a6de35e313d3fae038b269db9d003ce5a795bd964e9e17a4341e46"}
Apr 17 14:56:53.391987 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:53.391943 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c"
Apr 17 14:56:53.413084 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:53.412992 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c" podStartSLOduration=1.412975947 podStartE2EDuration="1.412975947s" podCreationTimestamp="2026-04-17 14:56:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:56:53.412069128 +0000 UTC m=+2131.718822757" watchObservedRunningTime="2026-04-17 14:56:53.412975947 +0000 UTC m=+2131.719729568"
Apr 17 14:56:54.764235 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:54.764203 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lp8nz_822a8df9-29ea-4649-a163-22e1db926c84/dns/0.log"
Apr 17 14:56:54.785191 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:54.785161 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lp8nz_822a8df9-29ea-4649-a163-22e1db926c84/kube-rbac-proxy/0.log"
Apr 17 14:56:54.910492 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:54.910463 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kvtp7_e8508015-adfb-42aa-acfc-92b24ec90241/dns-node-resolver/0.log"
Apr 17 14:56:55.343303 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:55.343273 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7b887f8f96-9xz6f_61a41f4f-43f3-483e-bf64-66b7a0d1f2a2/registry/0.log"
Apr 17 14:56:55.364882 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:55.364836 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-58lbn_822ddccf-2c1a-42b8-8b4f-b676b2aa2c9f/node-ca/0.log"
Apr 17 14:56:56.256507 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:56.256471 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4hj5k_0adf8995-4617-448e-b417-f69e0ac85b30/discovery/0.log"
Apr 17 14:56:56.276118 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:56.276088 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5b474cc896-hrn2m_1b235a5d-f63d-4963-bd85-69721b597cd4/kube-auth-proxy/0.log"
Apr 17 14:56:56.373166 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:56.373136 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-685d56bbbb-mp6gq_c067d95c-e09f-4c47-8281-f638f4740633/router/0.log"
Apr 17 14:56:56.862743 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:56.862710 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-869tz_d37b1dac-43fd-47dd-9f14-18b1f81b8155/serve-healthcheck-canary/0.log"
Apr 17 14:56:57.309190 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:57.309097 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-b4t8r_c5d9d12d-4816-4a8c-954c-b83681df2cd9/insights-operator/0.log"
Apr 17 14:56:57.309773 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:57.309757 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-b4t8r_c5d9d12d-4816-4a8c-954c-b83681df2cd9/insights-operator/1.log"
Apr 17 14:56:57.449063 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:57.449030 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zphcj_0090e783-2cdd-4c1a-b3ef-af664ae49c8f/kube-rbac-proxy/0.log"
Apr 17 14:56:57.469190 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:57.469161 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zphcj_0090e783-2cdd-4c1a-b3ef-af664ae49c8f/exporter/0.log"
Apr 17 14:56:57.489694 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:57.489664 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zphcj_0090e783-2cdd-4c1a-b3ef-af664ae49c8f/extractor/0.log"
Apr 17 14:56:59.295481 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:59.295449 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-sp7bk_cda3b0f0-f542-412c-aab3-748fec9a0d43/manager/0.log"
Apr 17 14:56:59.329575 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:59.329544 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-df86fd548-xff85_35b2e5a7-7f6e-47de-8ce9-e16a333c164c/maas-api/0.log"
Apr 17 14:56:59.399576 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:59.399536 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-24dl9_c4818cef-8813-4ed3-ae34-1ff8d2acb789/manager/1.log"
Apr 17 14:56:59.408686 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:59.408658 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-n2v6c"
Apr 17 14:56:59.412534 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:59.412512 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-24dl9_c4818cef-8813-4ed3-ae34-1ff8d2acb789/manager/2.log"
Apr 17 14:56:59.469488 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:56:59.469455 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-58c8f88b6d-rhcxr_5e4f5f7f-c245-4add-bc95-575616281d6a/manager/0.log"
Apr 17 14:57:00.582556 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:00.582526 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-fhlmq_0abf4507-e6c4-4663-b676-41c4b08ede12/openshift-lws-operator/0.log"
Apr 17 14:57:06.740051 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:06.740018 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vp9jm_63505374-1c69-4d7f-853d-90e9526b6d12/kube-multus-additional-cni-plugins/0.log"
Apr 17 14:57:06.765811 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:06.765784 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vp9jm_63505374-1c69-4d7f-853d-90e9526b6d12/egress-router-binary-copy/0.log"
Apr 17 14:57:06.784598 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:06.784573 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vp9jm_63505374-1c69-4d7f-853d-90e9526b6d12/cni-plugins/0.log"
Apr 17 14:57:06.802587 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:06.802569 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vp9jm_63505374-1c69-4d7f-853d-90e9526b6d12/bond-cni-plugin/0.log"
Apr 17 14:57:06.821931 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:06.821902 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vp9jm_63505374-1c69-4d7f-853d-90e9526b6d12/routeoverride-cni/0.log"
Apr 17 14:57:06.843014 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:06.842992 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vp9jm_63505374-1c69-4d7f-853d-90e9526b6d12/whereabouts-cni-bincopy/0.log"
Apr 17 14:57:06.863556 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:06.863525 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vp9jm_63505374-1c69-4d7f-853d-90e9526b6d12/whereabouts-cni/0.log"
Apr 17 14:57:06.894106 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:06.894082 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gxf4n_207ded88-4793-4bf3-9d1a-a6775c96a280/kube-multus/0.log"
Apr 17 14:57:07.002921 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:07.002828 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ccsgf_d0eca2ae-83f6-462e-b7d9-9ab1592717a8/network-metrics-daemon/0.log"
Apr 17 14:57:07.024629 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:07.024603 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ccsgf_d0eca2ae-83f6-462e-b7d9-9ab1592717a8/kube-rbac-proxy/0.log"
Apr 17 14:57:08.213045 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:08.213013 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x4dft_5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf/ovn-controller/0.log"
Apr 17 14:57:08.242233 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:08.242204 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x4dft_5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf/ovn-acl-logging/0.log"
Apr 17 14:57:08.261315 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:08.261280 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x4dft_5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf/kube-rbac-proxy-node/0.log"
Apr 17 14:57:08.280053 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:08.280031 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x4dft_5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 14:57:08.298289 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:08.298269 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x4dft_5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf/northd/0.log"
Apr 17 14:57:08.318658 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:08.318634 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x4dft_5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf/nbdb/0.log"
Apr 17 14:57:08.336512 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:08.336490 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x4dft_5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf/sbdb/0.log"
Apr 17 14:57:08.430746 ip-10-0-138-3 kubenswrapper[2575]: I0417 14:57:08.430719 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x4dft_5ab40dbd-e2b5-4a82-83d0-3a4846afc3cf/ovnkube-controller/0.log"