Apr 16 15:09:13.876994 ip-10-0-136-151 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 15:09:13.877009 ip-10-0-136-151 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 15:09:13.877018 ip-10-0-136-151 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 15:09:13.877325 ip-10-0-136-151 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 15:09:23.989410 ip-10-0-136-151 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 15:09:23.989428 ip-10-0-136-151 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b05ef8df80cc4573ab4e777042187e7d --
Apr 16 15:12:01.285966 ip-10-0-136-151 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 15:12:01.755916 ip-10-0-136-151 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 15:12:01.755916 ip-10-0-136-151 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 15:12:01.755916 ip-10-0-136-151 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 15:12:01.755916 ip-10-0-136-151 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 15:12:01.755916 ip-10-0-136-151 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 15:12:01.758369 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.758281    2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 15:12:01.763113 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763098    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 15:12:01.763113 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763114    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763117    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763121    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763126    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763130    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763133    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763136    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763139    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763142    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763145    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763148    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763150    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763153    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763155    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763158    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763160    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763163    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763170    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763173    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763175    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 15:12:01.763175 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763178    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763181    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763185    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763187    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763190    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763193    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763195    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763198    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763200    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763203    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763205    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763208    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763211    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763213    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763215    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763218    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763220    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763223    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763225    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763228    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 15:12:01.763659 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763231    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763234    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763236    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763239    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763242    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763246    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763250    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763253    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763255    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763258    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763261    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763264    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763268    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763270    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763275    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763278    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763281    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763283    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763286    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 15:12:01.764172 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763289    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763292    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763294    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763297    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763300    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763303    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763305    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763308    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763310    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763313    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763315    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763318    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763320    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763322    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763325    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763327    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763330    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763332    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763335    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 15:12:01.764616 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763338    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763340    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763343    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763345    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763348    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763350    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763353    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763747    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763752    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763755    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763758    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763760    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763763    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763766    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763768    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763771    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763774    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763776    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763779    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763782    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 15:12:01.765089 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763784    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763787    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763791    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763794    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763797    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763800    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763803    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763806    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763809    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763811    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763814    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763818    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763820    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763823    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763825    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763827    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763830    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763833    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763836    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 15:12:01.765562 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763838    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763841    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763843    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763845    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763848    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763850    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763852    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763855    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763857    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763860    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763862    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763865    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763868    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763871    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763873    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763875    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763878    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763881    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763883    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763886    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 15:12:01.766046 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763889    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763891    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763894    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763896    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763899    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763901    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763904    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763906    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763909    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763912    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763914    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763916    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763919    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763923    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763927    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763930    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763932    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763935    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763937    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 15:12:01.766527 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763939    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763942    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763944    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763947    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763949    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763952    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763954    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763957    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763959    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763962    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763964    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763966    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763969    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763972    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.763974    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764551    2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764565    2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764572    2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764577    2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764583    2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764586    2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 15:12:01.767032 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764593    2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764598    2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764601    2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764604    2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764608    2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764611    2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764614    2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764617    2573 flags.go:64] FLAG: --cgroup-root=""
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764621    2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764624    2573 flags.go:64] FLAG: --client-ca-file=""
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764627    2573 flags.go:64] FLAG: --cloud-config=""
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764630    2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764633    2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764637    2573 flags.go:64] FLAG: --cluster-domain=""
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764640    2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764658    2573 flags.go:64] FLAG: --config-dir=""
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764664    2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764670    2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764674    2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764677    2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764680    2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764684    2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764688    2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764690    2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 15:12:01.767538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764693    2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764697    2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764699    2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764704    2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764707    2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764710    2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764713    2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764717    2573 flags.go:64] FLAG: --enable-server="true"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764720    2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764724    2573 flags.go:64] FLAG: --event-burst="100"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764727    2573 flags.go:64] FLAG: --event-qps="50"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764731    2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764736    2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764739    2573 flags.go:64] FLAG: --eviction-hard=""
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764742    2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764746    2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764749    2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764752    2573 flags.go:64] FLAG: --eviction-soft=""
Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764755    2573 flags.go:64] FLAG:
--eviction-soft-grace-period="" Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764758 2573 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764760 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764763 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764766 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764770 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764773 2573 flags.go:64] FLAG: --feature-gates="" Apr 16 15:12:01.768133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764777 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764780 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764783 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764787 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764790 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764793 2573 flags.go:64] FLAG: --help="false" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764796 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-136-151.ec2.internal" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764799 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 15:12:01.768746 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:12:01.764802 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764804 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764808 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764811 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764814 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764816 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764819 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764823 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764826 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764829 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764832 2573 flags.go:64] FLAG: --kube-reserved="" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764836 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764839 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764842 2573 flags.go:64] FLAG: 
--kubelet-cgroups="" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764844 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764847 2573 flags.go:64] FLAG: --lock-file="" Apr 16 15:12:01.768746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764850 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764853 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764855 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764860 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764864 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764867 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764869 2573 flags.go:64] FLAG: --logging-format="text" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764872 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764876 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764879 2573 flags.go:64] FLAG: --manifest-url="" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764882 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764886 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 15:12:01.769310 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:12:01.764889 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764894 2573 flags.go:64] FLAG: --max-pods="110" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764897 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764900 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764902 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764905 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764909 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764911 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764914 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764921 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764925 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764928 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 15:12:01.769310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764935 2573 flags.go:64] FLAG: --pod-cidr="" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764938 2573 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764944 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764949 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764952 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764955 2573 flags.go:64] FLAG: --port="10250" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764958 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764960 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-099eafb58c666e644" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764964 2573 flags.go:64] FLAG: --qos-reserved="" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764967 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764969 2573 flags.go:64] FLAG: --register-node="true" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764972 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764975 2573 flags.go:64] FLAG: --register-with-taints="" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764979 2573 flags.go:64] FLAG: --registry-burst="10" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764982 2573 flags.go:64] FLAG: --registry-qps="5" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764985 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 16 
15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764988 2573 flags.go:64] FLAG: --reserved-memory="" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764991 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764994 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.764997 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765000 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765003 2573 flags.go:64] FLAG: --runonce="false" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765006 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765009 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765011 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 16 15:12:01.769875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765014 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765017 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765020 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765023 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765026 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 
15:12:01.765029 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765031 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765034 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765037 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765040 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765044 2573 flags.go:64] FLAG: --system-cgroups="" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765047 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765052 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765055 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765057 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765061 2573 flags.go:64] FLAG: --tls-min-version="" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765064 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765066 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765069 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765072 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 
15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765077 2573 flags.go:64] FLAG: --v="2" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765082 2573 flags.go:64] FLAG: --version="false" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765086 2573 flags.go:64] FLAG: --vmodule="" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765090 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765093 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 15:12:01.770466 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765197 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765201 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765204 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765207 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765210 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765212 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765215 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765217 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765220 2573 feature_gate.go:328] 
unrecognized feature gate: ShortCertRotation Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765222 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765225 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765227 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765230 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765232 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765235 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765237 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765240 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765243 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765246 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765248 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 15:12:01.771079 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765251 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 
15:12:01.765253 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765256 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765258 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765261 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765263 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765266 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765269 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765272 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765275 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765277 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765280 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765282 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765285 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 15:12:01.771936 ip-10-0-136-151 
kubenswrapper[2573]: W0416 15:12:01.765287 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765290 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765293 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765295 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765298 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765300 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 15:12:01.771936 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765303 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765305 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765307 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765310 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765312 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765317 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765320 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765323 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765326 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765329 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765332 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765335 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765337 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765340 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765342 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765345 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765347 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765350 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 
15:12:01.765352 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 15:12:01.772865 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765356 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765359 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765362 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765366 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765369 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765371 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765374 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765376 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765379 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765381 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765384 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765387 2573 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNS Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765389 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765391 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765394 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765397 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765399 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765402 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765404 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765407 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 15:12:01.773667 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765410 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 15:12:01.774422 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765413 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 15:12:01.774422 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765415 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 15:12:01.774422 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765418 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 15:12:01.774422 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765420 
2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 15:12:01.774422 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765423 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 15:12:01.774422 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.765426 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 15:12:01.774422 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.765957 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 15:12:01.775290 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.775271 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 15:12:01.775290 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.775290 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775340 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775346 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775349 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775352 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775354 2573 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775357 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775360 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775363 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775365 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775368 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775371 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775373 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 15:12:01.775372 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775377 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775379 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775382 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775385 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775387 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 15:12:01.775713 ip-10-0-136-151 
kubenswrapper[2573]: W0416 15:12:01.775391 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775393 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775397 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775400 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775402 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775405 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775407 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775410 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775412 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775415 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775417 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775420 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775423 2573 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775425 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775428 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 15:12:01.775713 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775430 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775433 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775435 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775438 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775440 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775443 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775445 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775447 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775451 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775456 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775458 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775462 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775465 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775468 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775471 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775474 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775477 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775480 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775483 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 15:12:01.776204 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775485 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775488 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775491 2573 
feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775494 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775497 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775499 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775502 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775504 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775507 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775509 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775511 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775514 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775516 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775519 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775521 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 15:12:01.776676 
ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775523 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775526 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775528 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775530 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775533 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 15:12:01.776676 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775536 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775538 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775540 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775543 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775547 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775551 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775555 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775557 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775560 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775562 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775565 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775567 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775570 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775572 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775574 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 15:12:01.777173 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.775580 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775692 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775699 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775702 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775705 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775708 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775711 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775714 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775717 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775720 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775723 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775725 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 15:12:01.777539 ip-10-0-136-151 
kubenswrapper[2573]: W0416 15:12:01.775728 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775730 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775733 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775735 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775738 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775740 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775743 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775745 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 15:12:01.777539 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775748 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775751 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775753 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775756 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775759 2573 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775762 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775765 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775767 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775770 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775772 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775775 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775778 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775780 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775782 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775785 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775787 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775789 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 
15:12:01.775792 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775794 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775797 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 15:12:01.778056 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775799 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775801 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775804 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775806 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775809 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775811 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775814 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775817 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775819 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775822 2573 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775824 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775827 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775829 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775832 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775835 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775837 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775840 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775843 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775845 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775848 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 15:12:01.778536 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775850 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775852 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775855 
2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775857 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775859 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775862 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775865 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775867 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775871 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775874 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775877 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775879 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775882 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775885 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775888 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775890 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775893 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775895 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775898 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 15:12:01.779098 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775900 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 15:12:01.779580 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775903 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 15:12:01.779580 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775905 2573 
feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 15:12:01.779580 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775908 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 15:12:01.779580 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775911 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 15:12:01.779580 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775913 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 15:12:01.779580 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775917 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 15:12:01.779580 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:01.775920 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 15:12:01.779580 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.775926 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 15:12:01.779580 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.776506 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 15:12:01.779580 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.779103 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 15:12:01.780086 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.780074 2573 server.go:1019] "Starting client certificate rotation" Apr 16 
15:12:01.780191 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.780172 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 15:12:01.780232 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.780215 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 15:12:01.804775 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.804755 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 15:12:01.807108 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.807090 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 15:12:01.821717 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.821700 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 16 15:12:01.826226 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.826213 2573 log.go:25] "Validated CRI v1 image API"
Apr 16 15:12:01.827387 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.827373 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 15:12:01.829484 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.829460 2573 fs.go:135] Filesystem UUIDs: map[145307b3-bf95-48a0-b896-63b417fb9e34:/dev/nvme0n1p4 1b746e4d-116d-49cb-b399-75e36d5e1e35:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 15:12:01.829524 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.829485 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 15:12:01.835974 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.835871 2573 manager.go:217] Machine: {Timestamp:2026-04-16 15:12:01.833844574 +0000 UTC m=+0.420048374 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100243 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec208a6b3705310a8c8196dc1154e7e8 SystemUUID:ec208a6b-3705-310a-8c81-96dc1154e7e8 BootID:b05ef8df-80cc-4573-ab4e-777042187e7d Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1b:e7:6d:29:bf Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1b:e7:6d:29:bf Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:8e:4f:6d:ad:e4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 15:12:01.835974 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.835968 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 15:12:01.836079 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.836047 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 15:12:01.837909 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.837889 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 15:12:01.838042 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.837912 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-151.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 15:12:01.838092 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.838049 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 15:12:01.838092 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.838057 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 15:12:01.838092 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.838069 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 15:12:01.838977 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.838966 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 15:12:01.840919 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.840909 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 15:12:01.841033 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.841024 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 15:12:01.844515 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.844501 2573 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 15:12:01.844515 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.844516 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 15:12:01.844630 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.844535 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 15:12:01.844630 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.844545 2573 kubelet.go:397] "Adding apiserver pod source"
Apr 16 15:12:01.844630 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.844555 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 15:12:01.845662 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.845631 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 15:12:01.845703 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.845667 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 15:12:01.852099 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.852073 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 15:12:01.853959 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.853945 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 15:12:01.856426 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856413 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 15:12:01.856494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856430 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 15:12:01.856494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856437 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 15:12:01.856494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856443 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 15:12:01.856494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856448 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 15:12:01.856494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856453 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 15:12:01.856494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856459 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 15:12:01.856494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856464 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 15:12:01.856494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856471 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 15:12:01.856494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856477 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 15:12:01.856494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856486 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 15:12:01.856494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.856493 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 15:12:01.857433 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.857421 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 15:12:01.857433 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.857430 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 15:12:01.860863 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.860850 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 15:12:01.860948 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.860886 2573 server.go:1295] "Started kubelet"
Apr 16 15:12:01.861030 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.860986 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 15:12:01.861064 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.861055 2573 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 15:12:01.861715 ip-10-0-136-151 systemd[1]: Started Kubernetes Kubelet.
Apr 16 15:12:01.861801 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.861763 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 15:12:01.862007 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.861994 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 15:12:01.863403 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.863381 2573 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 15:12:01.866768 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.866749 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 15:12:01.867484 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.867470 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 15:12:01.868405 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.868370 2573 factory.go:55] Registering systemd factory
Apr 16 15:12:01.868486 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.868445 2573 factory.go:223] Registration of the systemd container factory successfully
Apr 16 15:12:01.868702 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.868678 2573 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 15:12:01.868702 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.868704 2573 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 15:12:01.868847 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.868750 2573 factory.go:153] Registering CRI-O factory
Apr 16 15:12:01.868847 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.868772 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-151.ec2.internal\" not found"
Apr 16 15:12:01.868847 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.868782 2573 factory.go:223] Registration of the crio container factory successfully
Apr 16 15:12:01.868847 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.868782 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 15:12:01.868847 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.868838 2573 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 15:12:01.868847 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.868847 2573 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 15:12:01.869096 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.868846 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 15:12:01.869299 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.869280 2573 factory.go:103] Registering Raw factory
Apr 16 15:12:01.869365 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.869308 2573 manager.go:1196] Started watching for new ooms in manager
Apr 16 15:12:01.870626 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.870610 2573 manager.go:319] Starting recovery of all containers
Apr 16 15:12:01.872373 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.872312 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 15:12:01.880383 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.880228 2573 manager.go:324] Recovery completed
Apr 16 15:12:01.882839 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.882811 2573 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": readdirent /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 15:12:01.885981 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.885967 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 15:12:01.886347 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.886327 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 15:12:01.888552 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.888537 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 15:12:01.888600 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.888565 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 15:12:01.888600 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.888576 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientPID"
Apr 16 15:12:01.889010 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.888997 2573 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 15:12:01.889072 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.889009 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 15:12:01.889072 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.889048 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 15:12:01.891238 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.891223 2573 policy_none.go:49] "None policy: Start"
Apr 16 15:12:01.891315 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.891244 2573 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 15:12:01.891315 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.891259 2573 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 15:12:01.908755 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.908709 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-151.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 15:12:01.908755 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.908706 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 15:12:01.908871 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.908786 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 15:12:01.908871 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.908805 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-151.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 15:12:01.908929 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.908873 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-151.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 15:12:01.909634 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.908692 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-151.ec2.internal.18a6df0674750041 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-151.ec2.internal,UID:ip-10-0-136-151.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-151.ec2.internal,},FirstTimestamp:2026-04-16 15:12:01.860862017 +0000 UTC m=+0.447065816,LastTimestamp:2026-04-16 15:12:01.860862017 +0000 UTC m=+0.447065816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-151.ec2.internal,}"
Apr 16 15:12:01.925091 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.925015 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-151.ec2.internal.18a6df06761b8980 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-151.ec2.internal,UID:ip-10-0-136-151.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-151.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-151.ec2.internal,},FirstTimestamp:2026-04-16 15:12:01.888553344 +0000 UTC m=+0.474757143,LastTimestamp:2026-04-16 15:12:01.888553344 +0000 UTC m=+0.474757143,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-151.ec2.internal,}"
Apr 16 15:12:01.932297 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.932277 2573 manager.go:341] "Starting Device Plugin manager"
Apr 16 15:12:01.932416 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.932311 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 15:12:01.932416 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.932321 2573 server.go:85] "Starting device plugin registration server"
Apr 16 15:12:01.932521 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.932514 2573 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 15:12:01.932569 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.932522 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 15:12:01.934289 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.933380 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 15:12:01.934289 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.933478 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 15:12:01.934289 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:01.933487 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 15:12:01.934882 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.934667 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 15:12:01.934882 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.934712 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-151.ec2.internal\" not found"
Apr 16 15:12:01.949896 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.938253 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-151.ec2.internal.18a6df06761bcc5b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-151.ec2.internal,UID:ip-10-0-136-151.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-136-151.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-136-151.ec2.internal,},FirstTimestamp:2026-04-16 15:12:01.888570459 +0000 UTC m=+0.474774259,LastTimestamp:2026-04-16 15:12:01.888570459 +0000 UTC m=+0.474774259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-151.ec2.internal,}"
Apr 16 15:12:01.954965 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.954896 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-151.ec2.internal.18a6df06761bf191 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-151.ec2.internal,UID:ip-10-0-136-151.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-136-151.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-136-151.ec2.internal,},FirstTimestamp:2026-04-16 15:12:01.888579985 +0000 UTC m=+0.474783785,LastTimestamp:2026-04-16 15:12:01.888579985 +0000 UTC m=+0.474783785,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-151.ec2.internal,}"
Apr 16 15:12:01.976782 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:01.976714 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-151.ec2.internal.18a6df06793d369d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-151.ec2.internal,UID:ip-10-0-136-151.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-136-151.ec2.internal,},FirstTimestamp:2026-04-16 15:12:01.941091997 +0000 UTC m=+0.527295803,LastTimestamp:2026-04-16 15:12:01.941091997 +0000 UTC m=+0.527295803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-151.ec2.internal,}"
Apr 16 15:12:02.011802 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.011722 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 15:12:02.013088 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.013073 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 15:12:02.013167 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.013100 2573 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 15:12:02.013167 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.013126 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 15:12:02.013167 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.013133 2573 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 15:12:02.013356 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.013170 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 15:12:02.032982 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.032964 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 15:12:02.035322 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.035306 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 15:12:02.035395 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.035341 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 15:12:02.035395 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.035353 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientPID"
Apr 16 15:12:02.035395 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.035384 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.036214 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.036171 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 16 15:12:02.047594 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.047577 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9f9gz"
Apr 16 15:12:02.054514 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.054432 2573 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-136-151.ec2.internal.18a6df06761b8980\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-151.ec2.internal.18a6df06761b8980 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-151.ec2.internal,UID:ip-10-0-136-151.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-151.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-151.ec2.internal,},FirstTimestamp:2026-04-16 15:12:01.888553344 +0000 UTC m=+0.474757143,LastTimestamp:2026-04-16 15:12:02.035323587 +0000 UTC m=+0.621527388,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-151.ec2.internal,}"
Apr 16 15:12:02.065884 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.065864 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9f9gz"
Apr 16 15:12:02.071680 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.071660 2573 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-136-151.ec2.internal\" not found" node="ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.114004 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.113980 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-151.ec2.internal"]
Apr 16 15:12:02.114075 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.114069 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 15:12:02.114966 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.114951 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 15:12:02.115019 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.114983 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 15:12:02.115019 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.114994 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientPID"
Apr 16 15:12:02.116238 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.116226 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 15:12:02.116378 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.116365 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.116407 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.116396 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 15:12:02.116927 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.116906 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 15:12:02.116927 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.116925 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 15:12:02.117065 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.116939 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 15:12:02.117065 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.116943 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 15:12:02.117065 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.116954 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientPID"
Apr 16 15:12:02.117065 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.116954 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientPID"
Apr 16 15:12:02.118416 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.118399 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.118488 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.118423 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 15:12:02.119090 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.119062 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 15:12:02.119090 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.119092 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 15:12:02.119217 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.119101 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientPID"
Apr 16 15:12:02.127670 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.127638 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-151.ec2.internal\" not found" node="ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.146265 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.146249 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-151.ec2.internal\" not found" node="ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.150431 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.150416 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-151.ec2.internal\" not found" node="ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.170305 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.170283 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a58163a404ebe8fd3d421aefb0ba0f25-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal\" (UID: \"a58163a404ebe8fd3d421aefb0ba0f25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.271255 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.271177 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a58163a404ebe8fd3d421aefb0ba0f25-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal\" (UID: \"a58163a404ebe8fd3d421aefb0ba0f25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.271255 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.271217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a58163a404ebe8fd3d421aefb0ba0f25-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal\" (UID: \"a58163a404ebe8fd3d421aefb0ba0f25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.271255 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.271254 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a58163a404ebe8fd3d421aefb0ba0f25-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal\" (UID: \"a58163a404ebe8fd3d421aefb0ba0f25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.271436 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.271288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/965c3203e0e8294321ef158b59084066-config\") pod \"kube-apiserver-proxy-ip-10-0-136-151.ec2.internal\" (UID: \"965c3203e0e8294321ef158b59084066\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.272221 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.272197 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 15:12:02.273164 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.273149 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 15:12:02.273237 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.273179 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 15:12:02.273237 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.273198 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeHasSufficientPID"
Apr 16 15:12:02.273237 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.273226 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.312764 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.312742 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-151.ec2.internal"
Apr 16 15:12:02.312821 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.312767 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-151.ec2.internal\": node \"ip-10-0-136-151.ec2.internal\" not found"
Apr 16 15:12:02.372192 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.372158 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a58163a404ebe8fd3d421aefb0ba0f25-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal\" (UID: \"a58163a404ebe8fd3d421aefb0ba0f25\") "
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal" Apr 16 15:12:02.372192 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.372196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/965c3203e0e8294321ef158b59084066-config\") pod \"kube-apiserver-proxy-ip-10-0-136-151.ec2.internal\" (UID: \"965c3203e0e8294321ef158b59084066\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-151.ec2.internal" Apr 16 15:12:02.372368 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.372250 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/965c3203e0e8294321ef158b59084066-config\") pod \"kube-apiserver-proxy-ip-10-0-136-151.ec2.internal\" (UID: \"965c3203e0e8294321ef158b59084066\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-151.ec2.internal" Apr 16 15:12:02.372368 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.372253 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a58163a404ebe8fd3d421aefb0ba0f25-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal\" (UID: \"a58163a404ebe8fd3d421aefb0ba0f25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal" Apr 16 15:12:02.430279 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.430249 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-151.ec2.internal\" not found" Apr 16 15:12:02.449335 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.449315 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal" Apr 16 15:12:02.453174 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.453141 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-151.ec2.internal" Apr 16 15:12:02.530419 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.530356 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-151.ec2.internal\" not found" Apr 16 15:12:02.630990 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.630954 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-151.ec2.internal\" not found" Apr 16 15:12:02.731057 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.731022 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-151.ec2.internal\" not found" Apr 16 15:12:02.780765 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.780683 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 15:12:02.832036 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.832002 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-151.ec2.internal\" not found" Apr 16 15:12:02.867494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.867470 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 15:12:02.932720 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:02.932690 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-151.ec2.internal\" not found" Apr 16 15:12:02.948842 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:02.948804 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod965c3203e0e8294321ef158b59084066.slice/crio-ab461ca4612668dbab6cd5fee7fa77801210cbe35b76a7dd2fa3efbcb6cfdfe2 WatchSource:0}: Error finding container ab461ca4612668dbab6cd5fee7fa77801210cbe35b76a7dd2fa3efbcb6cfdfe2: Status 404 returned error 
can't find the container with id ab461ca4612668dbab6cd5fee7fa77801210cbe35b76a7dd2fa3efbcb6cfdfe2 Apr 16 15:12:02.949117 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:02.949099 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda58163a404ebe8fd3d421aefb0ba0f25.slice/crio-d07bbf322ae011ff43273c4f7203901db8e8d9132409ea6eb50eba437d9d1a06 WatchSource:0}: Error finding container d07bbf322ae011ff43273c4f7203901db8e8d9132409ea6eb50eba437d9d1a06: Status 404 returned error can't find the container with id d07bbf322ae011ff43273c4f7203901db8e8d9132409ea6eb50eba437d9d1a06 Apr 16 15:12:02.953419 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.953399 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:12:02.970392 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:02.970375 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 15:12:03.016364 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.016313 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal" event={"ID":"a58163a404ebe8fd3d421aefb0ba0f25","Type":"ContainerStarted","Data":"d07bbf322ae011ff43273c4f7203901db8e8d9132409ea6eb50eba437d9d1a06"} Apr 16 15:12:03.016701 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.016686 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 15:12:03.017619 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.017598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-151.ec2.internal" 
event={"ID":"965c3203e0e8294321ef158b59084066","Type":"ContainerStarted","Data":"ab461ca4612668dbab6cd5fee7fa77801210cbe35b76a7dd2fa3efbcb6cfdfe2"} Apr 16 15:12:03.022088 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.022072 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 15:12:03.068425 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.068366 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal" Apr 16 15:12:03.068516 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.068359 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 15:07:02 +0000 UTC" deadline="2027-11-08 23:03:47.147483084 +0000 UTC" Apr 16 15:12:03.068516 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.068470 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13711h51m44.07901888s" Apr 16 15:12:03.214855 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.214829 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 15:12:03.363693 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.363601 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 15:12:03.364888 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.364870 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-151.ec2.internal" Apr 16 15:12:03.383946 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.383921 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 
15:12:03.475796 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.475775 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 15:12:03.508705 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.508672 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9c4kp" Apr 16 15:12:03.600508 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.600475 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9c4kp" Apr 16 15:12:03.846344 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.846318 2573 apiserver.go:52] "Watching apiserver" Apr 16 15:12:03.857350 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.857323 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 15:12:03.859285 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.859258 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8","openshift-cluster-node-tuning-operator/tuned-bsxxt","openshift-image-registry/node-ca-4pfr6","openshift-multus/multus-8cgfx","openshift-network-diagnostics/network-check-target-ggs7q","openshift-network-operator/iptables-alerter-zf2wm","kube-system/konnectivity-agent-2dkml","openshift-dns/node-resolver-sjt8j","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal","openshift-multus/multus-additional-cni-plugins-4kjbb","openshift-multus/network-metrics-daemon-h9c44","openshift-ovn-kubernetes/ovnkube-node-n96lw","kube-system/kube-apiserver-proxy-ip-10-0-136-151.ec2.internal"] Apr 16 15:12:03.862272 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.862243 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-2dkml" Apr 16 15:12:03.863433 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.863224 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.863433 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.863318 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4pfr6" Apr 16 15:12:03.864291 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.864266 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.867678 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.867641 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:03.867779 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:03.867727 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:03.868778 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.868756 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zf2wm" Apr 16 15:12:03.868947 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.868924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" Apr 16 15:12:03.870140 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.870121 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.871336 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.871290 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sjt8j" Apr 16 15:12:03.873871 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.873856 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:03.873961 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:03.873910 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:03.873961 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.873860 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:03.879612 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.879593 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-var-lib-kubelet\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.879722 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.879630 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-run-k8s-cni-cncf-io\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.879722 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.879677 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-device-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" Apr 16 15:12:03.879828 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.879735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-sysctl-d\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.879828 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.879792 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-sysctl-conf\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.881005 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.880691 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-cnibin\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.881076 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881031 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-system-cni-dir\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.881076 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881056 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-os-release\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.881159 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-etc-kubernetes\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.881159 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881118 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-systemd\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.881159 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881145 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-conf-dir\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.881250 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-tuned\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.881250 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881225 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twwkr\" (UniqueName: \"kubernetes.io/projected/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-kube-api-access-twwkr\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.881307 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881250 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f287493-2507-4465-97b3-78fa8046ab43-host-slash\") pod \"iptables-alerter-zf2wm\" (UID: \"2f287493-2507-4465-97b3-78fa8046ab43\") " pod="openshift-network-operator/iptables-alerter-zf2wm" Apr 16 
15:12:03.881307 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881275 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-run-multus-certs\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.881307 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-cni-dir\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.881399 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881323 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" Apr 16 15:12:03.881399 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-system-cni-dir\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.881399 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881355 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-etc-selinux\") pod 
\"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" Apr 16 15:12:03.881399 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881372 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-sys\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.881399 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881394 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.881542 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881411 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m27xg\" (UniqueName: \"kubernetes.io/projected/2f287493-2507-4465-97b3-78fa8046ab43-kube-api-access-m27xg\") pod \"iptables-alerter-zf2wm\" (UID: \"2f287493-2507-4465-97b3-78fa8046ab43\") " pod="openshift-network-operator/iptables-alerter-zf2wm" Apr 16 15:12:03.881542 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881424 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/baca9b64-0c10-49d7-9301-8d81286c87b6-agent-certs\") pod \"konnectivity-agent-2dkml\" (UID: \"baca9b64-0c10-49d7-9301-8d81286c87b6\") " pod="kube-system/konnectivity-agent-2dkml" Apr 16 15:12:03.881542 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881440 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/baca9b64-0c10-49d7-9301-8d81286c87b6-konnectivity-ca\") pod \"konnectivity-agent-2dkml\" (UID: \"baca9b64-0c10-49d7-9301-8d81286c87b6\") " pod="kube-system/konnectivity-agent-2dkml" Apr 16 15:12:03.881542 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881454 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-var-lib-kubelet\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.881542 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881468 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-hostroot\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.881542 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881482 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-daemon-config\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.881542 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881502 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l758b\" (UniqueName: \"kubernetes.io/projected/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-kube-api-access-l758b\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " 
pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.881542 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881519 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf188288-b071-4a5d-867d-17022354cd51-host\") pod \"node-ca-4pfr6\" (UID: \"bf188288-b071-4a5d-867d-17022354cd51\") " pod="openshift-image-registry/node-ca-4pfr6" Apr 16 15:12:03.881542 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881534 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-os-release\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.881843 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881548 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-cni-binary-copy\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.881843 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.881843 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881577 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0b3e092e-d180-4205-ac3b-9d71123acb4d-cni-binary-copy\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.881843 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bjrv\" (UniqueName: \"kubernetes.io/projected/4caba475-2461-4633-a1d4-00e049169fb5-kube-api-access-4bjrv\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" Apr 16 15:12:03.881843 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881618 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-socket-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" Apr 16 15:12:03.881843 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881632 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-kubernetes\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.881843 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881664 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfpcz\" (UniqueName: \"kubernetes.io/projected/bf188288-b071-4a5d-867d-17022354cd51-kube-api-access-bfpcz\") pod \"node-ca-4pfr6\" (UID: \"bf188288-b071-4a5d-867d-17022354cd51\") " pod="openshift-image-registry/node-ca-4pfr6" Apr 16 15:12:03.881843 
ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881687 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-var-lib-cni-bin\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.882175 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881815 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-run-netns\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.882175 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.881956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-sys-fs\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" Apr 16 15:12:03.882175 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882000 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-lib-modules\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.882175 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-registration-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: 
\"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" Apr 16 15:12:03.882175 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882081 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-socket-dir-parent\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.882175 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882104 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-var-lib-cni-multus\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.882175 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882126 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-run\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.882175 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf188288-b071-4a5d-867d-17022354cd51-serviceca\") pod \"node-ca-4pfr6\" (UID: \"bf188288-b071-4a5d-867d-17022354cd51\") " pod="openshift-image-registry/node-ca-4pfr6" Apr 16 15:12:03.882175 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-cnibin\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.882561 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-sysconfig\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.882561 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882211 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctkdb\" (UniqueName: \"kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb\") pod \"network-check-target-ggs7q\" (UID: \"905049f6-70d0-42ea-bf9f-d37c4f436506\") " pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:03.882561 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmbt2\" (UniqueName: \"kubernetes.io/projected/0b3e092e-d180-4205-ac3b-9d71123acb4d-kube-api-access-kmbt2\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.882561 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882269 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f287493-2507-4465-97b3-78fa8046ab43-iptables-alerter-script\") pod \"iptables-alerter-zf2wm\" (UID: \"2f287493-2507-4465-97b3-78fa8046ab43\") " pod="openshift-network-operator/iptables-alerter-zf2wm" Apr 16 15:12:03.882561 ip-10-0-136-151 kubenswrapper[2573]: I0416 
15:12:03.882298 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-host\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.882561 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882319 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.882561 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-modprobe-d\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.882561 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.882371 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-tmp\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.884832 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.884812 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 15:12:03.884913 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.884851 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:12:03.885327 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.885310 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 15:12:03.885408 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.885343 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 15:12:03.885465 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.885343 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 15:12:03.885518 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.885497 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-756ks\"" Apr 16 15:12:03.885567 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.885528 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:12:03.887429 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.887412 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 15:12:03.891550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.888753 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 15:12:03.891550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.888834 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wjgzc\"" Apr 16 15:12:03.891550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.889008 2573 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 15:12:03.891550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.889381 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lzhhw\"" Apr 16 15:12:03.891550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.890421 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-stlgx\"" Apr 16 15:12:03.891550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.890483 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 15:12:03.891550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.890608 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 15:12:03.892273 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.892252 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5pcmx\"" Apr 16 15:12:03.892359 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.892287 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 15:12:03.893349 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.893330 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 15:12:03.893845 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.893831 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qw9hk\"" Apr 16 15:12:03.897009 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.896991 2573 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 15:12:03.897334 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.897320 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 15:12:03.898569 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.898550 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 15:12:03.898815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.898798 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 15:12:03.908124 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.908030 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 15:12:03.908299 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.908277 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 15:12:03.908442 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.908287 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 15:12:03.908529 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.908514 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8824q\"" Apr 16 15:12:03.908589 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.908579 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 15:12:03.908668 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.908632 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kclpt\"" Apr 16 15:12:03.908728 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.908680 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 15:12:03.908884 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.908858 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 15:12:03.908986 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.908928 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 15:12:03.909061 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.909038 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-dn9rz\"" Apr 16 15:12:03.921045 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.920839 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 15:12:03.924181 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.924166 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 15:12:03.926314 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.926298 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 15:12:03.969819 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.969798 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 15:12:03.983286 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983260 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-var-lib-cni-multus\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.983401 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-run\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.983401 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983371 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-run\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.983401 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983364 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-var-lib-cni-multus\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.983565 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf188288-b071-4a5d-867d-17022354cd51-serviceca\") pod \"node-ca-4pfr6\" (UID: \"bf188288-b071-4a5d-867d-17022354cd51\") " pod="openshift-image-registry/node-ca-4pfr6" Apr 16 15:12:03.983565 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983458 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-run-systemd\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:03.983565 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983486 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-run-openvswitch\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:03.983565 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983515 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-cnibin\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.983565 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983539 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-sysconfig\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.983794 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983572 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkdb\" (UniqueName: \"kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb\") pod \"network-check-target-ggs7q\" (UID: \"905049f6-70d0-42ea-bf9f-d37c4f436506\") " pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:03.983794 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kmbt2\" (UniqueName: \"kubernetes.io/projected/0b3e092e-d180-4205-ac3b-9d71123acb4d-kube-api-access-kmbt2\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.983794 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f287493-2507-4465-97b3-78fa8046ab43-iptables-alerter-script\") pod \"iptables-alerter-zf2wm\" (UID: \"2f287493-2507-4465-97b3-78fa8046ab43\") " pod="openshift-network-operator/iptables-alerter-zf2wm" Apr 16 15:12:03.983794 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-cnibin\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.983794 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-sysconfig\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.983794 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-host\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.983794 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.984082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983807 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1462d2c4-a9b3-43d9-a968-33a1359d6ccc-hosts-file\") pod \"node-resolver-sjt8j\" (UID: \"1462d2c4-a9b3-43d9-a968-33a1359d6ccc\") " pod="openshift-dns/node-resolver-sjt8j" Apr 16 15:12:03.984082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983824 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-host\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.984082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983834 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-slash\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:03.984082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983858 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-cni-netd\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:03.984082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983907 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/edf7b8a9-878b-451c-86c3-6a78d2ece872-ovnkube-config\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:03.984082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-modprobe-d\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.984082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-tmp\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.984082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.983992 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-var-lib-kubelet\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.984082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-run-k8s-cni-cncf-io\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.984082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984051 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-device-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" Apr 16 15:12:03.984082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984075 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-sysctl-d\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-sysctl-conf\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984138 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-run-ovn\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984167 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-modprobe-d\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:03.984557 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:12:03.984215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-cnibin\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-cnibin\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f287493-2507-4465-97b3-78fa8046ab43-iptables-alerter-script\") pod \"iptables-alerter-zf2wm\" (UID: \"2f287493-2507-4465-97b3-78fa8046ab43\") " pod="openshift-network-operator/iptables-alerter-zf2wm" Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-system-cni-dir\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984276 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-run-k8s-cni-cncf-io\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx" Apr 16 
15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-var-lib-kubelet\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984272 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-os-release\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-device-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-etc-kubernetes\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984397 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf188288-b071-4a5d-867d-17022354cd51-serviceca\") pod \"node-ca-4pfr6\" (UID: \"bf188288-b071-4a5d-867d-17022354cd51\") " pod="openshift-image-registry/node-ca-4pfr6"
Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-systemd\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984426 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-etc-kubernetes\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984444 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-os-release\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.984557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984486 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-systemd\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984495 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-system-cni-dir\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984525 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-conf-dir\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-sysctl-conf\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984559 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-tuned\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twwkr\" (UniqueName: \"kubernetes.io/projected/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-kube-api-access-twwkr\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984583 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-conf-dir\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984600 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6xnx\" (UniqueName: \"kubernetes.io/projected/414282ac-76dc-4992-819a-bdf05043c09d-kube-api-access-x6xnx\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984603 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-sysctl-d\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984639 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-systemd-units\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984676 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8zn5\" (UniqueName: \"kubernetes.io/projected/edf7b8a9-878b-451c-86c3-6a78d2ece872-kube-api-access-q8zn5\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f287493-2507-4465-97b3-78fa8046ab43-host-slash\") pod \"iptables-alerter-zf2wm\" (UID: \"2f287493-2507-4465-97b3-78fa8046ab43\") " pod="openshift-network-operator/iptables-alerter-zf2wm"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984776 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-run-multus-certs\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/edf7b8a9-878b-451c-86c3-6a78d2ece872-env-overrides\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984827 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f287493-2507-4465-97b3-78fa8046ab43-host-slash\") pod \"iptables-alerter-zf2wm\" (UID: \"2f287493-2507-4465-97b3-78fa8046ab43\") " pod="openshift-network-operator/iptables-alerter-zf2wm"
Apr 16 15:12:03.985343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-run-multus-certs\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/edf7b8a9-878b-451c-86c3-6a78d2ece872-ovn-node-metrics-cert\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984921 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/edf7b8a9-878b-451c-86c3-6a78d2ece872-ovnkube-script-lib\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-cni-dir\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984966 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.984985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-system-cni-dir\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-kubelet\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-etc-selinux\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985029 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-sys\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985071 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-system-cni-dir\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985103 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985109 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-cni-dir\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985137 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-etc-openvswitch\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985154 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-sys\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985158 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m27xg\" (UniqueName: \"kubernetes.io/projected/2f287493-2507-4465-97b3-78fa8046ab43-kube-api-access-m27xg\") pod \"iptables-alerter-zf2wm\" (UID: \"2f287493-2507-4465-97b3-78fa8046ab43\") " pod="openshift-network-operator/iptables-alerter-zf2wm"
Apr 16 15:12:03.986176 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/baca9b64-0c10-49d7-9301-8d81286c87b6-agent-certs\") pod \"konnectivity-agent-2dkml\" (UID: \"baca9b64-0c10-49d7-9301-8d81286c87b6\") " pod="kube-system/konnectivity-agent-2dkml"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985189 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/baca9b64-0c10-49d7-9301-8d81286c87b6-konnectivity-ca\") pod \"konnectivity-agent-2dkml\" (UID: \"baca9b64-0c10-49d7-9301-8d81286c87b6\") " pod="kube-system/konnectivity-agent-2dkml"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985198 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-etc-selinux\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-var-lib-kubelet\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985226 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-var-lib-kubelet\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-hostroot\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985273 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-daemon-config\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l758b\" (UniqueName: \"kubernetes.io/projected/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-kube-api-access-l758b\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6shh\" (UniqueName: \"kubernetes.io/projected/1462d2c4-a9b3-43d9-a968-33a1359d6ccc-kube-api-access-q6shh\") pod \"node-resolver-sjt8j\" (UID: \"1462d2c4-a9b3-43d9-a968-33a1359d6ccc\") " pod="openshift-dns/node-resolver-sjt8j"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf188288-b071-4a5d-867d-17022354cd51-host\") pod \"node-ca-4pfr6\" (UID: \"bf188288-b071-4a5d-867d-17022354cd51\") " pod="openshift-image-registry/node-ca-4pfr6"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985407 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-os-release\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-cni-binary-copy\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0b3e092e-d180-4205-ac3b-9d71123acb4d-cni-binary-copy\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bjrv\" (UniqueName: \"kubernetes.io/projected/4caba475-2461-4633-a1d4-00e049169fb5-kube-api-access-4bjrv\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-run-ovn-kubernetes\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985564 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-socket-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.986964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-kubernetes\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985615 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfpcz\" (UniqueName: \"kubernetes.io/projected/bf188288-b071-4a5d-867d-17022354cd51-kube-api-access-bfpcz\") pod \"node-ca-4pfr6\" (UID: \"bf188288-b071-4a5d-867d-17022354cd51\") " pod="openshift-image-registry/node-ca-4pfr6"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985675 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1462d2c4-a9b3-43d9-a968-33a1359d6ccc-tmp-dir\") pod \"node-resolver-sjt8j\" (UID: \"1462d2c4-a9b3-43d9-a968-33a1359d6ccc\") " pod="openshift-dns/node-resolver-sjt8j"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985702 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-var-lib-openvswitch\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985733 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-node-log\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985758 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-log-socket\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-cni-bin\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985810 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-var-lib-cni-bin\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-run-netns\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985865 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-sys-fs\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-lib-modules\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985917 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-run-netns\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-registration-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.985978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-socket-dir-parent\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986070 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-socket-dir-parent\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986138 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-hostroot\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.987629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986201 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-socket-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986272 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-kubernetes\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986551 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-var-lib-cni-bin\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986608 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0b3e092e-d180-4205-ac3b-9d71123acb4d-host-run-netns\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0b3e092e-d180-4205-ac3b-9d71123acb4d-multus-daemon-config\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-sys-fs\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-os-release\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986783 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-lib-modules\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986843 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4caba475-2461-4633-a1d4-00e049169fb5-registration-dir\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.986947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf188288-b071-4a5d-867d-17022354cd51-host\") pod \"node-ca-4pfr6\" (UID: \"bf188288-b071-4a5d-867d-17022354cd51\") " pod="openshift-image-registry/node-ca-4pfr6"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.987327 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/baca9b64-0c10-49d7-9301-8d81286c87b6-konnectivity-ca\") pod \"konnectivity-agent-2dkml\" (UID: \"baca9b64-0c10-49d7-9301-8d81286c87b6\") " pod="kube-system/konnectivity-agent-2dkml"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.987399 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0b3e092e-d180-4205-ac3b-9d71123acb4d-cni-binary-copy\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.987947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-cni-binary-copy\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.988331 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.988151 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-tmp\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:03.988803 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.988411 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/baca9b64-0c10-49d7-9301-8d81286c87b6-agent-certs\") pod \"konnectivity-agent-2dkml\" (UID: \"baca9b64-0c10-49d7-9301-8d81286c87b6\") " pod="kube-system/konnectivity-agent-2dkml"
Apr 16 15:12:03.988803 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.988453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:03.988803 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:03.988764 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-etc-tuned\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:04.022340 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.022314 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l758b\" (UniqueName: \"kubernetes.io/projected/914c9c02-e314-4b0c-b5f6-aab4f4681c8d-kube-api-access-l758b\") pod \"multus-additional-cni-plugins-4kjbb\" (UID: \"914c9c02-e314-4b0c-b5f6-aab4f4681c8d\") " pod="openshift-multus/multus-additional-cni-plugins-4kjbb"
Apr 16 15:12:04.029131 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.029111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twwkr\" (UniqueName: \"kubernetes.io/projected/81bf7298-f21b-4f3a-adb6-d3a8c98c8da7-kube-api-access-twwkr\") pod \"tuned-bsxxt\" (UID: \"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7\") " pod="openshift-cluster-node-tuning-operator/tuned-bsxxt"
Apr 16 15:12:04.043700 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.043678 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m27xg\" (UniqueName: \"kubernetes.io/projected/2f287493-2507-4465-97b3-78fa8046ab43-kube-api-access-m27xg\") pod \"iptables-alerter-zf2wm\" (UID: \"2f287493-2507-4465-97b3-78fa8046ab43\") " pod="openshift-network-operator/iptables-alerter-zf2wm"
Apr 16 15:12:04.064916 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.064880 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmbt2\" (UniqueName: \"kubernetes.io/projected/0b3e092e-d180-4205-ac3b-9d71123acb4d-kube-api-access-kmbt2\") pod \"multus-8cgfx\" (UID: \"0b3e092e-d180-4205-ac3b-9d71123acb4d\") " pod="openshift-multus/multus-8cgfx"
Apr 16 15:12:04.067059 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.067038 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bjrv\" (UniqueName: \"kubernetes.io/projected/4caba475-2461-4633-a1d4-00e049169fb5-kube-api-access-4bjrv\") pod \"aws-ebs-csi-driver-node-lrfp8\" (UID: \"4caba475-2461-4633-a1d4-00e049169fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8"
Apr 16 15:12:04.087164 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087134 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6xnx\" (UniqueName: \"kubernetes.io/projected/414282ac-76dc-4992-819a-bdf05043c09d-kube-api-access-x6xnx\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44"
Apr 16 15:12:04.087281 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087179 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-systemd-units\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:04.087281 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087250 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-systemd-units\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:12:04.087391 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n96lw\" (UID:
\"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.087391 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8zn5\" (UniqueName: \"kubernetes.io/projected/edf7b8a9-878b-451c-86c3-6a78d2ece872-kube-api-access-q8zn5\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.087490 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087398 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/edf7b8a9-878b-451c-86c3-6a78d2ece872-env-overrides\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.087490 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087430 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/edf7b8a9-878b-451c-86c3-6a78d2ece872-ovn-node-metrics-cert\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.087490 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/edf7b8a9-878b-451c-86c3-6a78d2ece872-ovnkube-script-lib\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-kubelet\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-etc-openvswitch\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087782 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-kubelet\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087808 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q6shh\" (UniqueName: \"kubernetes.io/projected/1462d2c4-a9b3-43d9-a968-33a1359d6ccc-kube-api-access-q6shh\") pod \"node-resolver-sjt8j\" (UID: \"1462d2c4-a9b3-43d9-a968-33a1359d6ccc\") " pod="openshift-dns/node-resolver-sjt8j" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-run-ovn-kubernetes\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087892 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1462d2c4-a9b3-43d9-a968-33a1359d6ccc-tmp-dir\") pod \"node-resolver-sjt8j\" (UID: \"1462d2c4-a9b3-43d9-a968-33a1359d6ccc\") " pod="openshift-dns/node-resolver-sjt8j" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087888 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-etc-openvswitch\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087918 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-var-lib-openvswitch\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/edf7b8a9-878b-451c-86c3-6a78d2ece872-env-overrides\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.087929 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-node-log\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087960 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-run-ovn-kubernetes\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087976 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-var-lib-openvswitch\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.087987 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-log-socket\") pod \"ovnkube-node-n96lw\" (UID: 
\"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-cni-bin\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-node-log\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-log-socket\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088057 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-cni-bin\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.088085 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs podName:414282ac-76dc-4992-819a-bdf05043c09d nodeName:}" failed. 
No retries permitted until 2026-04-16 15:12:04.588015899 +0000 UTC m=+3.174219702 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs") pod "network-metrics-daemon-h9c44" (UID: "414282ac-76dc-4992-819a-bdf05043c09d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088148 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-run-netns\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088157 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1462d2c4-a9b3-43d9-a968-33a1359d6ccc-tmp-dir\") pod \"node-resolver-sjt8j\" (UID: \"1462d2c4-a9b3-43d9-a968-33a1359d6ccc\") " pod="openshift-dns/node-resolver-sjt8j" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088186 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-run-systemd\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088220 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-run-openvswitch\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 
15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088233 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-run-netns\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-run-systemd\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088294 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1462d2c4-a9b3-43d9-a968-33a1359d6ccc-hosts-file\") pod \"node-resolver-sjt8j\" (UID: \"1462d2c4-a9b3-43d9-a968-33a1359d6ccc\") " pod="openshift-dns/node-resolver-sjt8j" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-slash\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088323 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-run-openvswitch\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:12:04.088341 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-cni-netd\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/edf7b8a9-878b-451c-86c3-6a78d2ece872-ovnkube-config\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1462d2c4-a9b3-43d9-a968-33a1359d6ccc-hosts-file\") pod \"node-resolver-sjt8j\" (UID: \"1462d2c4-a9b3-43d9-a968-33a1359d6ccc\") " pod="openshift-dns/node-resolver-sjt8j" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088365 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-slash\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.088815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088430 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-run-ovn\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.089399 ip-10-0-136-151 kubenswrapper[2573]: I0416 
15:12:04.088435 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-host-cni-netd\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.089399 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088526 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/edf7b8a9-878b-451c-86c3-6a78d2ece872-run-ovn\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.089399 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088594 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/edf7b8a9-878b-451c-86c3-6a78d2ece872-ovnkube-script-lib\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.089399 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.088806 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/edf7b8a9-878b-451c-86c3-6a78d2ece872-ovnkube-config\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.091406 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.091388 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/edf7b8a9-878b-451c-86c3-6a78d2ece872-ovn-node-metrics-cert\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.097833 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.097782 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfpcz\" (UniqueName: \"kubernetes.io/projected/bf188288-b071-4a5d-867d-17022354cd51-kube-api-access-bfpcz\") pod \"node-ca-4pfr6\" (UID: \"bf188288-b071-4a5d-867d-17022354cd51\") " pod="openshift-image-registry/node-ca-4pfr6" Apr 16 15:12:04.153190 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.153131 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8zn5\" (UniqueName: \"kubernetes.io/projected/edf7b8a9-878b-451c-86c3-6a78d2ece872-kube-api-access-q8zn5\") pod \"ovnkube-node-n96lw\" (UID: \"edf7b8a9-878b-451c-86c3-6a78d2ece872\") " pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.154688 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.154666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6xnx\" (UniqueName: \"kubernetes.io/projected/414282ac-76dc-4992-819a-bdf05043c09d-kube-api-access-x6xnx\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:04.175332 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.175312 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-2dkml" Apr 16 15:12:04.176442 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.176423 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:12:04.176442 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.176444 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:12:04.176558 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.176454 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ctkdb for pod openshift-network-diagnostics/network-check-target-ggs7q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:04.176777 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.176764 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb podName:905049f6-70d0-42ea-bf9f-d37c4f436506 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:04.676744868 +0000 UTC m=+3.262948655 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ctkdb" (UniqueName: "kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb") pod "network-check-target-ggs7q" (UID: "905049f6-70d0-42ea-bf9f-d37c4f436506") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:04.179449 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.179420 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6shh\" (UniqueName: \"kubernetes.io/projected/1462d2c4-a9b3-43d9-a968-33a1359d6ccc-kube-api-access-q6shh\") pod \"node-resolver-sjt8j\" (UID: \"1462d2c4-a9b3-43d9-a968-33a1359d6ccc\") " pod="openshift-dns/node-resolver-sjt8j" Apr 16 15:12:04.182729 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.182710 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" Apr 16 15:12:04.195130 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.195109 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4pfr6" Apr 16 15:12:04.202780 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.202758 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8cgfx" Apr 16 15:12:04.212373 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.212352 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zf2wm" Apr 16 15:12:04.219178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.219158 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" Apr 16 15:12:04.226483 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.226462 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4kjbb" Apr 16 15:12:04.234059 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.234041 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sjt8j" Apr 16 15:12:04.238978 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.238958 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:04.592754 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.592726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:04.592927 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.592895 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:04.592980 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.592969 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs podName:414282ac-76dc-4992-819a-bdf05043c09d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:05.592946108 +0000 UTC m=+4.179149917 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs") pod "network-metrics-daemon-h9c44" (UID: "414282ac-76dc-4992-819a-bdf05043c09d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:04.601756 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.601712 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:07:03 +0000 UTC" deadline="2027-11-09 19:45:02.815017457 +0000 UTC" Apr 16 15:12:04.601756 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.601747 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13732h32m58.213273762s" Apr 16 15:12:04.693104 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:04.693048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkdb\" (UniqueName: \"kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb\") pod \"network-check-target-ggs7q\" (UID: \"905049f6-70d0-42ea-bf9f-d37c4f436506\") " pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:04.693273 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.693206 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:12:04.693273 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.693229 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:12:04.693273 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.693240 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ctkdb for pod openshift-network-diagnostics/network-check-target-ggs7q: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:04.693480 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:04.693305 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb podName:905049f6-70d0-42ea-bf9f-d37c4f436506 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:05.69328479 +0000 UTC m=+4.279488578 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ctkdb" (UniqueName: "kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb") pod "network-check-target-ggs7q" (UID: "905049f6-70d0-42ea-bf9f-d37c4f436506") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:04.769988 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:04.769962 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81bf7298_f21b_4f3a_adb6_d3a8c98c8da7.slice/crio-783f8e6a0c1b072badb8b8aeb3c1b310f9932a835912adecd620479ef5ce477c WatchSource:0}: Error finding container 783f8e6a0c1b072badb8b8aeb3c1b310f9932a835912adecd620479ef5ce477c: Status 404 returned error can't find the container with id 783f8e6a0c1b072badb8b8aeb3c1b310f9932a835912adecd620479ef5ce477c Apr 16 15:12:04.771364 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:04.771330 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b3e092e_d180_4205_ac3b_9d71123acb4d.slice/crio-fd9aee0e995d35bf0f6c63ff137b956925b595c0ec459b1fcd6a188158351947 WatchSource:0}: Error finding container fd9aee0e995d35bf0f6c63ff137b956925b595c0ec459b1fcd6a188158351947: Status 404 returned error can't find the 
container with id fd9aee0e995d35bf0f6c63ff137b956925b595c0ec459b1fcd6a188158351947 Apr 16 15:12:04.773504 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:04.773480 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod914c9c02_e314_4b0c_b5f6_aab4f4681c8d.slice/crio-d1aadc34041bc0aa0833246770cfce4e4331d3944bff45116c2e57135fe28527 WatchSource:0}: Error finding container d1aadc34041bc0aa0833246770cfce4e4331d3944bff45116c2e57135fe28527: Status 404 returned error can't find the container with id d1aadc34041bc0aa0833246770cfce4e4331d3944bff45116c2e57135fe28527 Apr 16 15:12:04.775435 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:04.775365 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaca9b64_0c10_49d7_9301_8d81286c87b6.slice/crio-d68de3e40af7c4b8fe01213285027aa73838624ca99804507c14b012317535d3 WatchSource:0}: Error finding container d68de3e40af7c4b8fe01213285027aa73838624ca99804507c14b012317535d3: Status 404 returned error can't find the container with id d68de3e40af7c4b8fe01213285027aa73838624ca99804507c14b012317535d3 Apr 16 15:12:04.776321 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:04.776302 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedf7b8a9_878b_451c_86c3_6a78d2ece872.slice/crio-43b84ebb73b247a57cca06669ab76d4bf77ceed1fa2f8c4ffd0d3f10dde7f479 WatchSource:0}: Error finding container 43b84ebb73b247a57cca06669ab76d4bf77ceed1fa2f8c4ffd0d3f10dde7f479: Status 404 returned error can't find the container with id 43b84ebb73b247a57cca06669ab76d4bf77ceed1fa2f8c4ffd0d3f10dde7f479 Apr 16 15:12:04.777572 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:12:04.777353 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f287493_2507_4465_97b3_78fa8046ab43.slice/crio-cf8d3df3c7e634fd191942035aee9d7cf87674ee67c4b5536b4218f8f3eda92b WatchSource:0}: Error finding container cf8d3df3c7e634fd191942035aee9d7cf87674ee67c4b5536b4218f8f3eda92b: Status 404 returned error can't find the container with id cf8d3df3c7e634fd191942035aee9d7cf87674ee67c4b5536b4218f8f3eda92b Apr 16 15:12:05.021608 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.021578 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4pfr6" event={"ID":"bf188288-b071-4a5d-867d-17022354cd51","Type":"ContainerStarted","Data":"ad145df9125392973e27dccf91b0052b0b057ea5549e162118b04a831aa075a7"} Apr 16 15:12:05.022485 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.022450 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zf2wm" event={"ID":"2f287493-2507-4465-97b3-78fa8046ab43","Type":"ContainerStarted","Data":"cf8d3df3c7e634fd191942035aee9d7cf87674ee67c4b5536b4218f8f3eda92b"} Apr 16 15:12:05.025894 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.025872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" event={"ID":"edf7b8a9-878b-451c-86c3-6a78d2ece872","Type":"ContainerStarted","Data":"43b84ebb73b247a57cca06669ab76d4bf77ceed1fa2f8c4ffd0d3f10dde7f479"} Apr 16 15:12:05.026848 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.026825 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4kjbb" event={"ID":"914c9c02-e314-4b0c-b5f6-aab4f4681c8d","Type":"ContainerStarted","Data":"d1aadc34041bc0aa0833246770cfce4e4331d3944bff45116c2e57135fe28527"} Apr 16 15:12:05.027767 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.027745 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2dkml" 
event={"ID":"baca9b64-0c10-49d7-9301-8d81286c87b6","Type":"ContainerStarted","Data":"d68de3e40af7c4b8fe01213285027aa73838624ca99804507c14b012317535d3"} Apr 16 15:12:05.030333 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.030309 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-151.ec2.internal" event={"ID":"965c3203e0e8294321ef158b59084066","Type":"ContainerStarted","Data":"303e7be976621dd246239a3dd563365351d311953bf8df19ab1d9efcd48b059f"} Apr 16 15:12:05.031303 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.031282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sjt8j" event={"ID":"1462d2c4-a9b3-43d9-a968-33a1359d6ccc","Type":"ContainerStarted","Data":"a5f810039ae8fe5be492034b54c9729ec9f4ff053eda61c9bc4611f56cd3e700"} Apr 16 15:12:05.032282 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.032261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" event={"ID":"4caba475-2461-4633-a1d4-00e049169fb5","Type":"ContainerStarted","Data":"9e4dee31d95fca33360cae6bd9b26cb7da771b54a1591153dd398b5288191260"} Apr 16 15:12:05.033185 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.033168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8cgfx" event={"ID":"0b3e092e-d180-4205-ac3b-9d71123acb4d","Type":"ContainerStarted","Data":"fd9aee0e995d35bf0f6c63ff137b956925b595c0ec459b1fcd6a188158351947"} Apr 16 15:12:05.034041 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.034020 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" event={"ID":"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7","Type":"ContainerStarted","Data":"783f8e6a0c1b072badb8b8aeb3c1b310f9932a835912adecd620479ef5ce477c"} Apr 16 15:12:05.136511 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.136372 2573 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-151.ec2.internal" podStartSLOduration=2.136353455 podStartE2EDuration="2.136353455s" podCreationTimestamp="2026-04-16 15:12:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:12:05.136168334 +0000 UTC m=+3.722372166" watchObservedRunningTime="2026-04-16 15:12:05.136353455 +0000 UTC m=+3.722557265" Apr 16 15:12:05.599802 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.599756 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:05.599922 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:05.599887 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:05.599980 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:05.599958 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs podName:414282ac-76dc-4992-819a-bdf05043c09d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:07.599938521 +0000 UTC m=+6.186142309 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs") pod "network-metrics-daemon-h9c44" (UID: "414282ac-76dc-4992-819a-bdf05043c09d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:05.602978 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.602946 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:07:03 +0000 UTC" deadline="2027-12-04 17:33:06.295424249 +0000 UTC" Apr 16 15:12:05.602978 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.602977 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14330h21m0.692450297s" Apr 16 15:12:05.700578 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:05.700488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkdb\" (UniqueName: \"kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb\") pod \"network-check-target-ggs7q\" (UID: \"905049f6-70d0-42ea-bf9f-d37c4f436506\") " pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:05.700752 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:05.700636 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:12:05.700752 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:05.700676 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:12:05.700752 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:05.700688 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ctkdb for pod openshift-network-diagnostics/network-check-target-ggs7q: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:05.700752 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:05.700743 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb podName:905049f6-70d0-42ea-bf9f-d37c4f436506 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:07.700724676 +0000 UTC m=+6.286928483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ctkdb" (UniqueName: "kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb") pod "network-check-target-ggs7q" (UID: "905049f6-70d0-42ea-bf9f-d37c4f436506") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:06.015980 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:06.015946 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:06.016192 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:06.016077 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:06.016494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:06.016471 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:06.016595 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:06.016569 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:06.040281 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:06.040186 2573 generic.go:358] "Generic (PLEG): container finished" podID="a58163a404ebe8fd3d421aefb0ba0f25" containerID="b1c7d8941f160d1c572a59a90d3c03de1dba91d3049e56d3a85bffe366a1ba9d" exitCode=0 Apr 16 15:12:06.040728 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:06.040289 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal" event={"ID":"a58163a404ebe8fd3d421aefb0ba0f25","Type":"ContainerDied","Data":"b1c7d8941f160d1c572a59a90d3c03de1dba91d3049e56d3a85bffe366a1ba9d"} Apr 16 15:12:07.049016 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:07.048828 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal" event={"ID":"a58163a404ebe8fd3d421aefb0ba0f25","Type":"ContainerStarted","Data":"80e34851f1818d41ee36e1a80ddc38e71b6ff5cd9d3d63c333e451d56a7c19b5"} Apr 16 15:12:07.616884 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:07.616837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " 
pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:07.617072 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:07.617016 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:07.617157 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:07.617076 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs podName:414282ac-76dc-4992-819a-bdf05043c09d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:11.617058457 +0000 UTC m=+10.203262248 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs") pod "network-metrics-daemon-h9c44" (UID: "414282ac-76dc-4992-819a-bdf05043c09d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:07.717279 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:07.717245 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkdb\" (UniqueName: \"kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb\") pod \"network-check-target-ggs7q\" (UID: \"905049f6-70d0-42ea-bf9f-d37c4f436506\") " pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:07.717436 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:07.717385 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:12:07.717436 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:07.717409 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:12:07.717436 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:07.717422 
2573 projected.go:194] Error preparing data for projected volume kube-api-access-ctkdb for pod openshift-network-diagnostics/network-check-target-ggs7q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:07.717600 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:07.717476 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb podName:905049f6-70d0-42ea-bf9f-d37c4f436506 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:11.717457729 +0000 UTC m=+10.303661522 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ctkdb" (UniqueName: "kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb") pod "network-check-target-ggs7q" (UID: "905049f6-70d0-42ea-bf9f-d37c4f436506") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:08.013934 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:08.013874 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:08.013934 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:08.013936 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:08.014160 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:08.014045 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:08.014327 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:08.014295 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:10.013828 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:10.013794 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:10.014323 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:10.013924 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:10.014323 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:10.013982 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:10.014323 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:10.014092 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:11.650779 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:11.650737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:11.651206 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:11.650885 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:11.651206 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:11.650947 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs podName:414282ac-76dc-4992-819a-bdf05043c09d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:19.650928713 +0000 UTC m=+18.237132503 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs") pod "network-metrics-daemon-h9c44" (UID: "414282ac-76dc-4992-819a-bdf05043c09d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:11.751347 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:11.751264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkdb\" (UniqueName: \"kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb\") pod \"network-check-target-ggs7q\" (UID: \"905049f6-70d0-42ea-bf9f-d37c4f436506\") " pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:11.751520 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:11.751467 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:12:11.751520 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:11.751505 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:12:11.751520 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:11.751520 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ctkdb for pod openshift-network-diagnostics/network-check-target-ggs7q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:11.751714 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:11.751579 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb podName:905049f6-70d0-42ea-bf9f-d37c4f436506 nodeName:}" failed. 
No retries permitted until 2026-04-16 15:12:19.75155939 +0000 UTC m=+18.337763181 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ctkdb" (UniqueName: "kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb") pod "network-check-target-ggs7q" (UID: "905049f6-70d0-42ea-bf9f-d37c4f436506") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:12.014707 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:12.014607 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:12.014849 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:12.014707 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:12.014849 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:12.014776 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:12.014959 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:12.014908 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:14.013962 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:14.013924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:14.014335 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:14.013937 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:14.014335 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:14.014062 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:14.014335 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:14.014114 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:16.013947 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:16.013910 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:16.014375 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:16.013925 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:16.014375 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:16.014052 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:16.014375 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:16.014177 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:18.014099 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:18.014011 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:18.014099 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:18.014058 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:18.014514 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:18.014145 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:18.014514 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:18.014254 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:19.710553 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:19.710504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:19.711015 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:19.710687 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:19.711015 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:19.710748 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs podName:414282ac-76dc-4992-819a-bdf05043c09d nodeName:}" failed. No retries permitted until 2026-04-16 15:12:35.710733624 +0000 UTC m=+34.296937416 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs") pod "network-metrics-daemon-h9c44" (UID: "414282ac-76dc-4992-819a-bdf05043c09d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:19.811391 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:19.811339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkdb\" (UniqueName: \"kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb\") pod \"network-check-target-ggs7q\" (UID: \"905049f6-70d0-42ea-bf9f-d37c4f436506\") " pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:19.811571 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:19.811554 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:12:19.811617 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:19.811579 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:12:19.811617 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:19.811593 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ctkdb for pod openshift-network-diagnostics/network-check-target-ggs7q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:19.811716 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:19.811676 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb podName:905049f6-70d0-42ea-bf9f-d37c4f436506 nodeName:}" failed. 
No retries permitted until 2026-04-16 15:12:35.811642051 +0000 UTC m=+34.397845855 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ctkdb" (UniqueName: "kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb") pod "network-check-target-ggs7q" (UID: "905049f6-70d0-42ea-bf9f-d37c4f436506") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:20.013755 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:20.013675 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:20.013966 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:20.013692 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:20.013966 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:20.013837 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:20.013966 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:20.013876 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:22.014188 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:22.014103 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:22.014188 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:22.014184 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:22.014545 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:22.014288 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:22.014545 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:22.014428 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:23.079514 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.079093 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4pfr6" event={"ID":"bf188288-b071-4a5d-867d-17022354cd51","Type":"ContainerStarted","Data":"576d7b9e795f46be395c6590262673401ebb85b1ae0146544741981d3c73839c"} Apr 16 15:12:23.085739 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.085717 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:12:23.086880 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.086734 2573 generic.go:358] "Generic (PLEG): container finished" podID="edf7b8a9-878b-451c-86c3-6a78d2ece872" containerID="05952efaa1eb430352924e7785c4fd5f37ada95a28ba3a5ea4003e8110ad011c" exitCode=1 Apr 16 15:12:23.086880 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.086769 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" event={"ID":"edf7b8a9-878b-451c-86c3-6a78d2ece872","Type":"ContainerStarted","Data":"0c55b58feabfda0bd43218627ba3f13bf5000827defcd0c6dd42bcd8bbf63aad"} Apr 16 15:12:23.086880 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.086796 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" event={"ID":"edf7b8a9-878b-451c-86c3-6a78d2ece872","Type":"ContainerStarted","Data":"526e7d43ec8c930395202106cf7d05ade5078e1878c76da1bd97c0a379bdd90a"} Apr 16 15:12:23.086880 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.086813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" event={"ID":"edf7b8a9-878b-451c-86c3-6a78d2ece872","Type":"ContainerStarted","Data":"ed577538fb42000912349e1717a2331eae671b00799706bcd314402bfa5ff5fc"} Apr 16 
15:12:23.086880 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.086835 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" event={"ID":"edf7b8a9-878b-451c-86c3-6a78d2ece872","Type":"ContainerStarted","Data":"3dd6b2434f4c140b163a2d1d7454fa8658e836d15ae3baac6cb95d206ab55495"} Apr 16 15:12:23.086880 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.086846 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" event={"ID":"edf7b8a9-878b-451c-86c3-6a78d2ece872","Type":"ContainerDied","Data":"05952efaa1eb430352924e7785c4fd5f37ada95a28ba3a5ea4003e8110ad011c"} Apr 16 15:12:23.086880 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.086859 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" event={"ID":"edf7b8a9-878b-451c-86c3-6a78d2ece872","Type":"ContainerStarted","Data":"2e785a941a0a9993a708fd2e4afcc7a4733328b763eeac13308c84f2bca224b6"} Apr 16 15:12:23.088477 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.088459 2573 generic.go:358] "Generic (PLEG): container finished" podID="914c9c02-e314-4b0c-b5f6-aab4f4681c8d" containerID="252791ce6cbf58b79c246a5109cbdfbb497c45314e428036f7efc831ed6a1478" exitCode=0 Apr 16 15:12:23.088551 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.088513 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4kjbb" event={"ID":"914c9c02-e314-4b0c-b5f6-aab4f4681c8d","Type":"ContainerDied","Data":"252791ce6cbf58b79c246a5109cbdfbb497c45314e428036f7efc831ed6a1478"} Apr 16 15:12:23.090432 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.090407 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2dkml" event={"ID":"baca9b64-0c10-49d7-9301-8d81286c87b6","Type":"ContainerStarted","Data":"9092a0224d7643575e94a2d9e5c2fe14bd2643c96e6bedb2667ba630ff1690a0"} Apr 16 15:12:23.092160 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:12:23.091793 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sjt8j" event={"ID":"1462d2c4-a9b3-43d9-a968-33a1359d6ccc","Type":"ContainerStarted","Data":"76e378fb74c51a24929e48e147b06006eb3b9e2c43394855a1e75d1a82bcb26b"} Apr 16 15:12:23.093253 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.093228 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" event={"ID":"4caba475-2461-4633-a1d4-00e049169fb5","Type":"ContainerStarted","Data":"54c889d75b3aecfb6cc627c357b7d764bc2c591040cfc8fd74e7e5f645033963"} Apr 16 15:12:23.094583 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.094560 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8cgfx" event={"ID":"0b3e092e-d180-4205-ac3b-9d71123acb4d","Type":"ContainerStarted","Data":"f201bf54721dfc055ea8c3529807ff249d16c0670440c4a1d3daf5bb22a2e084"} Apr 16 15:12:23.096320 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.096282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" event={"ID":"81bf7298-f21b-4f3a-adb6-d3a8c98c8da7","Type":"ContainerStarted","Data":"d021b1d66b756a71fd1226b541326ab48ae119b6d5f316353619b9239863b1b8"} Apr 16 15:12:23.109705 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.109643 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-151.ec2.internal" podStartSLOduration=20.109607325 podStartE2EDuration="20.109607325s" podCreationTimestamp="2026-04-16 15:12:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:12:07.08240091 +0000 UTC m=+5.668604723" watchObservedRunningTime="2026-04-16 15:12:23.109607325 +0000 UTC m=+21.695811135" Apr 16 15:12:23.195542 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:12:23.195489 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4pfr6" podStartSLOduration=8.624618128 podStartE2EDuration="21.195476614s" podCreationTimestamp="2026-04-16 15:12:02 +0000 UTC" firstStartedPulling="2026-04-16 15:12:04.783001497 +0000 UTC m=+3.369205298" lastFinishedPulling="2026-04-16 15:12:17.353859993 +0000 UTC m=+15.940063784" observedRunningTime="2026-04-16 15:12:23.110804003 +0000 UTC m=+21.697007825" watchObservedRunningTime="2026-04-16 15:12:23.195476614 +0000 UTC m=+21.781680422" Apr 16 15:12:23.195739 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.195575 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8cgfx" podStartSLOduration=3.854086615 podStartE2EDuration="21.195569027s" podCreationTimestamp="2026-04-16 15:12:02 +0000 UTC" firstStartedPulling="2026-04-16 15:12:04.773623142 +0000 UTC m=+3.359826933" lastFinishedPulling="2026-04-16 15:12:22.115105558 +0000 UTC m=+20.701309345" observedRunningTime="2026-04-16 15:12:23.195211697 +0000 UTC m=+21.781415518" watchObservedRunningTime="2026-04-16 15:12:23.195569027 +0000 UTC m=+21.781773023" Apr 16 15:12:23.279851 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.279796 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2dkml" podStartSLOduration=4.295695327 podStartE2EDuration="21.279776623s" podCreationTimestamp="2026-04-16 15:12:02 +0000 UTC" firstStartedPulling="2026-04-16 15:12:04.777612393 +0000 UTC m=+3.363816183" lastFinishedPulling="2026-04-16 15:12:21.761693679 +0000 UTC m=+20.347897479" observedRunningTime="2026-04-16 15:12:23.279478338 +0000 UTC m=+21.865682147" watchObservedRunningTime="2026-04-16 15:12:23.279776623 +0000 UTC m=+21.865980433" Apr 16 15:12:23.383352 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.383314 2573 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-dns/node-resolver-sjt8j" podStartSLOduration=3.091251497 podStartE2EDuration="20.383300102s" podCreationTimestamp="2026-04-16 15:12:03 +0000 UTC" firstStartedPulling="2026-04-16 15:12:04.783348584 +0000 UTC m=+3.369552375" lastFinishedPulling="2026-04-16 15:12:22.075397179 +0000 UTC m=+20.661600980" observedRunningTime="2026-04-16 15:12:23.383014153 +0000 UTC m=+21.969217961" watchObservedRunningTime="2026-04-16 15:12:23.383300102 +0000 UTC m=+21.969503910" Apr 16 15:12:23.490081 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.490034 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bsxxt" podStartSLOduration=4.186410046 podStartE2EDuration="21.490020712s" podCreationTimestamp="2026-04-16 15:12:02 +0000 UTC" firstStartedPulling="2026-04-16 15:12:04.771789555 +0000 UTC m=+3.357993345" lastFinishedPulling="2026-04-16 15:12:22.075400219 +0000 UTC m=+20.661604011" observedRunningTime="2026-04-16 15:12:23.471879307 +0000 UTC m=+22.058083115" watchObservedRunningTime="2026-04-16 15:12:23.490020712 +0000 UTC m=+22.076224520" Apr 16 15:12:23.885230 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.885202 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 15:12:23.952486 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.952378 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T15:12:23.885221991Z","UUID":"9579ad2d-6c66-40b5-8f94-8937234c1208","Handler":null,"Name":"","Endpoint":""} Apr 16 15:12:23.955659 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.955631 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock 
versions: 1.0.0 Apr 16 15:12:23.955764 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:23.955691 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 15:12:24.013992 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:24.013483 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:24.013992 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:24.013501 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:24.013992 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:24.013610 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:24.013992 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:24.013749 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:24.099567 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:24.099537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" event={"ID":"4caba475-2461-4633-a1d4-00e049169fb5","Type":"ContainerStarted","Data":"4aa64b3c73552e3dddbcb02b6ada2ffaab3e220bc6499a6206959bf6dfd584c6"} Apr 16 15:12:24.101078 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:24.101038 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zf2wm" event={"ID":"2f287493-2507-4465-97b3-78fa8046ab43","Type":"ContainerStarted","Data":"02354b5e2145955ecb0479267743b836b648d287873bbde0e6341077211a9c06"} Apr 16 15:12:24.146384 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:24.146337 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zf2wm" podStartSLOduration=5.166507345 podStartE2EDuration="22.146320876s" podCreationTimestamp="2026-04-16 15:12:02 +0000 UTC" firstStartedPulling="2026-04-16 15:12:04.781877917 +0000 UTC m=+3.368081705" lastFinishedPulling="2026-04-16 15:12:21.76169143 +0000 UTC m=+20.347895236" observedRunningTime="2026-04-16 15:12:24.145476978 +0000 UTC m=+22.731680787" watchObservedRunningTime="2026-04-16 15:12:24.146320876 +0000 UTC m=+22.732524686" Apr 16 15:12:26.013843 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:26.013591 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:26.014262 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:26.013582 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:26.014262 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:26.013964 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:26.014262 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:26.014013 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:26.106586 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:26.106551 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" event={"ID":"4caba475-2461-4633-a1d4-00e049169fb5","Type":"ContainerStarted","Data":"c0620bb2f6082521b054a4ad7d3aac89cb8fb27f06436b97782320d48955f25e"} Apr 16 15:12:26.109323 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:26.109299 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:12:26.109751 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:26.109719 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" 
event={"ID":"edf7b8a9-878b-451c-86c3-6a78d2ece872","Type":"ContainerStarted","Data":"e6e8afd48f42d140191f688ebadd39137ed2681d817659e0071ca090eb299c5c"} Apr 16 15:12:26.135385 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:26.135328 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lrfp8" podStartSLOduration=3.61320643 podStartE2EDuration="24.135313737s" podCreationTimestamp="2026-04-16 15:12:02 +0000 UTC" firstStartedPulling="2026-04-16 15:12:04.781730442 +0000 UTC m=+3.367934233" lastFinishedPulling="2026-04-16 15:12:25.303837738 +0000 UTC m=+23.890041540" observedRunningTime="2026-04-16 15:12:26.135237669 +0000 UTC m=+24.721441479" watchObservedRunningTime="2026-04-16 15:12:26.135313737 +0000 UTC m=+24.721517546" Apr 16 15:12:26.156364 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:26.156329 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2dkml" Apr 16 15:12:26.157094 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:26.157074 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2dkml" Apr 16 15:12:27.112521 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:27.112283 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2dkml" Apr 16 15:12:27.113366 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:27.113349 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2dkml" Apr 16 15:12:28.014014 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:28.013839 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:28.014193 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:28.013899 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:28.014193 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:28.014082 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:28.014327 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:28.014194 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:28.115865 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:28.115841 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:12:28.116750 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:28.116173 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" event={"ID":"edf7b8a9-878b-451c-86c3-6a78d2ece872","Type":"ContainerStarted","Data":"3bd5e865fbaaf63cbe29949de34da83f175cf89520e3ae3c4e043b0ad749f1e5"} Apr 16 15:12:28.116750 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:28.116416 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:28.116750 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:28.116679 2573 scope.go:117] "RemoveContainer" 
containerID="05952efaa1eb430352924e7785c4fd5f37ada95a28ba3a5ea4003e8110ad011c" Apr 16 15:12:28.117797 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:28.117771 2573 generic.go:358] "Generic (PLEG): container finished" podID="914c9c02-e314-4b0c-b5f6-aab4f4681c8d" containerID="05b5e2834c31d39a1c7a2a8394a52a9c2c85a9dd43a8d0b085511161bdd86802" exitCode=0 Apr 16 15:12:28.118381 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:28.118356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4kjbb" event={"ID":"914c9c02-e314-4b0c-b5f6-aab4f4681c8d","Type":"ContainerDied","Data":"05b5e2834c31d39a1c7a2a8394a52a9c2c85a9dd43a8d0b085511161bdd86802"} Apr 16 15:12:28.133024 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:28.132993 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:29.123075 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.123000 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:12:29.123450 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.123312 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" event={"ID":"edf7b8a9-878b-451c-86c3-6a78d2ece872","Type":"ContainerStarted","Data":"d60cc1ef96c1cef0d665308a34847a94aa0beba46f8a5140d059de96ead1a9db"} Apr 16 15:12:29.123513 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.123468 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 15:12:29.123681 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.123639 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:29.125261 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.125237 2573 generic.go:358] "Generic (PLEG): container 
finished" podID="914c9c02-e314-4b0c-b5f6-aab4f4681c8d" containerID="91b7786ed8b2634b70e5bac3910ba5fbcad302f09f131121af6c8a87c5c9b2bf" exitCode=0 Apr 16 15:12:29.125365 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.125299 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4kjbb" event={"ID":"914c9c02-e314-4b0c-b5f6-aab4f4681c8d","Type":"ContainerDied","Data":"91b7786ed8b2634b70e5bac3910ba5fbcad302f09f131121af6c8a87c5c9b2bf"} Apr 16 15:12:29.138780 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.138759 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:29.218997 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.218953 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" podStartSLOduration=8.842928255 podStartE2EDuration="26.218938676s" podCreationTimestamp="2026-04-16 15:12:03 +0000 UTC" firstStartedPulling="2026-04-16 15:12:04.779770794 +0000 UTC m=+3.365974588" lastFinishedPulling="2026-04-16 15:12:22.155781206 +0000 UTC m=+20.741985009" observedRunningTime="2026-04-16 15:12:29.178259416 +0000 UTC m=+27.764463224" watchObservedRunningTime="2026-04-16 15:12:29.218938676 +0000 UTC m=+27.805142482" Apr 16 15:12:29.895937 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.895892 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ggs7q"] Apr 16 15:12:29.896119 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.896026 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:29.896169 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:29.896129 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:29.900201 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.900174 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h9c44"] Apr 16 15:12:29.900355 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:29.900291 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:29.900421 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:29.900401 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:30.107521 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:30.107502 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw" Apr 16 15:12:30.129639 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:30.129611 2573 generic.go:358] "Generic (PLEG): container finished" podID="914c9c02-e314-4b0c-b5f6-aab4f4681c8d" containerID="5a72df7fa9450a003ad0568c1058933b931240b0af4f0ead4936cf98e8ed5bb9" exitCode=0 Apr 16 15:12:30.129972 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:30.129681 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4kjbb" event={"ID":"914c9c02-e314-4b0c-b5f6-aab4f4681c8d","Type":"ContainerDied","Data":"5a72df7fa9450a003ad0568c1058933b931240b0af4f0ead4936cf98e8ed5bb9"} Apr 16 15:12:31.014111 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:31.013930 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:31.014288 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:31.014181 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:32.015102 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:32.015068 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:32.015507 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:32.015185 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:33.013335 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:33.013299 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:33.013513 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:33.013426 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:34.014162 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:34.014130 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:34.014634 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:34.014280 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d" Apr 16 15:12:35.013611 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:35.013577 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:12:35.013862 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:35.013707 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggs7q" podUID="905049f6-70d0-42ea-bf9f-d37c4f436506" Apr 16 15:12:35.729874 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:35.729848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:12:35.730207 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:35.729984 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:35.730207 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:35.730047 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs podName:414282ac-76dc-4992-819a-bdf05043c09d nodeName:}" failed. No retries permitted until 2026-04-16 15:13:07.730032774 +0000 UTC m=+66.316236562 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs") pod "network-metrics-daemon-h9c44" (UID: "414282ac-76dc-4992-819a-bdf05043c09d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 15:12:35.830761 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:35.830735 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkdb\" (UniqueName: \"kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb\") pod \"network-check-target-ggs7q\" (UID: \"905049f6-70d0-42ea-bf9f-d37c4f436506\") " pod="openshift-network-diagnostics/network-check-target-ggs7q"
Apr 16 15:12:35.830881 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:35.830866 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 15:12:35.830881 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:35.830879 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 15:12:35.830959 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:35.830889 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ctkdb for pod openshift-network-diagnostics/network-check-target-ggs7q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 15:12:35.830959 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:35.830932 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb podName:905049f6-70d0-42ea-bf9f-d37c4f436506 nodeName:}" failed. No retries permitted until 2026-04-16 15:13:07.830920647 +0000 UTC m=+66.417124434 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ctkdb" (UniqueName: "kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb") pod "network-check-target-ggs7q" (UID: "905049f6-70d0-42ea-bf9f-d37c4f436506") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 15:12:36.013690 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.013669 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44"
Apr 16 15:12:36.013799 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:36.013781 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d"
Apr 16 15:12:36.143235 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.143146 2573 generic.go:358] "Generic (PLEG): container finished" podID="914c9c02-e314-4b0c-b5f6-aab4f4681c8d" containerID="b5100d1e54cb5e49805dbff4798617a5d4583709c4ba02d0de31d736c5fd5ec6" exitCode=0
Apr 16 15:12:36.143235 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.143200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4kjbb" event={"ID":"914c9c02-e314-4b0c-b5f6-aab4f4681c8d","Type":"ContainerDied","Data":"b5100d1e54cb5e49805dbff4798617a5d4583709c4ba02d0de31d736c5fd5ec6"}
Apr 16 15:12:36.186008 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.185986 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-151.ec2.internal" event="NodeReady"
Apr 16 15:12:36.186130 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.186103 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 15:12:36.278470 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.278444 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mgxdh"]
Apr 16 15:12:36.296474 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.296447 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kw9vq"]
Apr 16 15:12:36.296594 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.296563 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:36.303538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.303471 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xrkdr\""
Apr 16 15:12:36.303538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.303503 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 15:12:36.305361 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.305343 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 15:12:36.317804 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.317784 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kw9vq"]
Apr 16 15:12:36.317804 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.317806 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mgxdh"]
Apr 16 15:12:36.317912 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.317887 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:12:36.323026 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.323006 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 15:12:36.324905 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.324887 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 15:12:36.325166 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.325152 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rc5hb\""
Apr 16 15:12:36.325240 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.325185 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 15:12:36.434375 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.434348 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5c03038e-c257-4668-a7fa-62c438af26c8-tmp-dir\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:36.434514 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.434390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:12:36.434514 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.434420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c03038e-c257-4668-a7fa-62c438af26c8-config-volume\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:36.434514 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.434461 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:36.434608 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.434545 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5dfn\" (UniqueName: \"kubernetes.io/projected/5c03038e-c257-4668-a7fa-62c438af26c8-kube-api-access-v5dfn\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:36.434608 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.434570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds49f\" (UniqueName: \"kubernetes.io/projected/8d53317c-dbd7-4c13-ba73-b51e72998223-kube-api-access-ds49f\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:12:36.535614 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.535538 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:12:36.535614 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.535583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c03038e-c257-4668-a7fa-62c438af26c8-config-volume\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:36.535614 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.535600 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:36.535873 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.535626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5dfn\" (UniqueName: \"kubernetes.io/projected/5c03038e-c257-4668-a7fa-62c438af26c8-kube-api-access-v5dfn\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:36.535873 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.535642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds49f\" (UniqueName: \"kubernetes.io/projected/8d53317c-dbd7-4c13-ba73-b51e72998223-kube-api-access-ds49f\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:12:36.535873 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.535697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5c03038e-c257-4668-a7fa-62c438af26c8-tmp-dir\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:36.535873 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:36.535723 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:12:36.535873 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:36.535787 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert podName:8d53317c-dbd7-4c13-ba73-b51e72998223 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:37.035767523 +0000 UTC m=+35.621971310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert") pod "ingress-canary-kw9vq" (UID: "8d53317c-dbd7-4c13-ba73-b51e72998223") : secret "canary-serving-cert" not found
Apr 16 15:12:36.535873 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:36.535724 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:12:36.535873 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:36.535831 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls podName:5c03038e-c257-4668-a7fa-62c438af26c8 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:37.035822406 +0000 UTC m=+35.622026200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls") pod "dns-default-mgxdh" (UID: "5c03038e-c257-4668-a7fa-62c438af26c8") : secret "dns-default-metrics-tls" not found
Apr 16 15:12:36.536083 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.536019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5c03038e-c257-4668-a7fa-62c438af26c8-tmp-dir\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:36.547556 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.547521 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c03038e-c257-4668-a7fa-62c438af26c8-config-volume\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:36.549690 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.549669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds49f\" (UniqueName: \"kubernetes.io/projected/8d53317c-dbd7-4c13-ba73-b51e72998223-kube-api-access-ds49f\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:12:36.549690 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:36.549685 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5dfn\" (UniqueName: \"kubernetes.io/projected/5c03038e-c257-4668-a7fa-62c438af26c8-kube-api-access-v5dfn\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:37.013339 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:37.013309 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q"
Apr 16 15:12:37.016919 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:37.016900 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xkvgm\""
Apr 16 15:12:37.017805 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:37.017790 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 15:12:37.017860 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:37.017810 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 15:12:37.039285 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:37.039264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:12:37.039367 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:37.039303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:37.039406 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:37.039393 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:12:37.039406 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:37.039400 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:12:37.039463 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:37.039440 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls podName:5c03038e-c257-4668-a7fa-62c438af26c8 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:38.039426607 +0000 UTC m=+36.625630394 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls") pod "dns-default-mgxdh" (UID: "5c03038e-c257-4668-a7fa-62c438af26c8") : secret "dns-default-metrics-tls" not found
Apr 16 15:12:37.039463 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:37.039451 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert podName:8d53317c-dbd7-4c13-ba73-b51e72998223 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:38.039446189 +0000 UTC m=+36.625649975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert") pod "ingress-canary-kw9vq" (UID: "8d53317c-dbd7-4c13-ba73-b51e72998223") : secret "canary-serving-cert" not found
Apr 16 15:12:37.148085 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:37.148058 2573 generic.go:358] "Generic (PLEG): container finished" podID="914c9c02-e314-4b0c-b5f6-aab4f4681c8d" containerID="bbe5f97028b32601b20c090c9203aa99b681823db242561903768915cc6bda6f" exitCode=0
Apr 16 15:12:37.148223 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:37.148102 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4kjbb" event={"ID":"914c9c02-e314-4b0c-b5f6-aab4f4681c8d","Type":"ContainerDied","Data":"bbe5f97028b32601b20c090c9203aa99b681823db242561903768915cc6bda6f"}
Apr 16 15:12:38.014021 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:38.013987 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44"
Apr 16 15:12:38.018815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:38.018583 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 15:12:38.018934 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:38.018708 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5mf5n\""
Apr 16 15:12:38.047570 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:38.047544 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:12:38.047694 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:38.047582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:38.047694 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:38.047687 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:12:38.047790 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:38.047689 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:12:38.047790 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:38.047734 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls podName:5c03038e-c257-4668-a7fa-62c438af26c8 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:40.047721789 +0000 UTC m=+38.633925576 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls") pod "dns-default-mgxdh" (UID: "5c03038e-c257-4668-a7fa-62c438af26c8") : secret "dns-default-metrics-tls" not found
Apr 16 15:12:38.047790 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:38.047747 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert podName:8d53317c-dbd7-4c13-ba73-b51e72998223 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:40.047741357 +0000 UTC m=+38.633945144 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert") pod "ingress-canary-kw9vq" (UID: "8d53317c-dbd7-4c13-ba73-b51e72998223") : secret "canary-serving-cert" not found
Apr 16 15:12:38.155569 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:38.155537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4kjbb" event={"ID":"914c9c02-e314-4b0c-b5f6-aab4f4681c8d","Type":"ContainerStarted","Data":"ba49d495cdcd4cd1bdbeab20a7ae06b8f7662b224082176a8de053c3b843ecec"}
Apr 16 15:12:38.187315 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:38.187269 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4kjbb" podStartSLOduration=5.110996381 podStartE2EDuration="36.187255929s" podCreationTimestamp="2026-04-16 15:12:02 +0000 UTC" firstStartedPulling="2026-04-16 15:12:04.775728532 +0000 UTC m=+3.361932322" lastFinishedPulling="2026-04-16 15:12:35.851988083 +0000 UTC m=+34.438191870" observedRunningTime="2026-04-16 15:12:38.186937559 +0000 UTC m=+36.773141379" watchObservedRunningTime="2026-04-16 15:12:38.187255929 +0000 UTC m=+36.773459739"
Apr 16 15:12:40.061726 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:40.061693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:12:40.062234 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:40.061752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:40.062234 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:40.061851 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:12:40.062234 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:40.061903 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:12:40.062234 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:40.061930 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert podName:8d53317c-dbd7-4c13-ba73-b51e72998223 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:44.061914222 +0000 UTC m=+42.648118029 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert") pod "ingress-canary-kw9vq" (UID: "8d53317c-dbd7-4c13-ba73-b51e72998223") : secret "canary-serving-cert" not found
Apr 16 15:12:40.062234 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:40.061952 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls podName:5c03038e-c257-4668-a7fa-62c438af26c8 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:44.061937046 +0000 UTC m=+42.648140846 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls") pod "dns-default-mgxdh" (UID: "5c03038e-c257-4668-a7fa-62c438af26c8") : secret "dns-default-metrics-tls" not found
Apr 16 15:12:44.091312 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:44.091258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:12:44.091312 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:44.091318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:44.091780 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:44.091407 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:12:44.091780 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:44.091413 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:12:44.091780 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:44.091459 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls podName:5c03038e-c257-4668-a7fa-62c438af26c8 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:52.09144569 +0000 UTC m=+50.677649477 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls") pod "dns-default-mgxdh" (UID: "5c03038e-c257-4668-a7fa-62c438af26c8") : secret "dns-default-metrics-tls" not found
Apr 16 15:12:44.091780 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:44.091473 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert podName:8d53317c-dbd7-4c13-ba73-b51e72998223 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:52.091467307 +0000 UTC m=+50.677671094 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert") pod "ingress-canary-kw9vq" (UID: "8d53317c-dbd7-4c13-ba73-b51e72998223") : secret "canary-serving-cert" not found
Apr 16 15:12:52.145348 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:52.145305 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:12:52.145348 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:12:52.145357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:12:52.146308 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:52.145452 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:12:52.146308 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:52.145456 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:12:52.146308 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:52.145514 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls podName:5c03038e-c257-4668-a7fa-62c438af26c8 nodeName:}" failed. No retries permitted until 2026-04-16 15:13:08.145501571 +0000 UTC m=+66.731705359 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls") pod "dns-default-mgxdh" (UID: "5c03038e-c257-4668-a7fa-62c438af26c8") : secret "dns-default-metrics-tls" not found
Apr 16 15:12:52.146308 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:12:52.145528 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert podName:8d53317c-dbd7-4c13-ba73-b51e72998223 nodeName:}" failed. No retries permitted until 2026-04-16 15:13:08.145521514 +0000 UTC m=+66.731725301 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert") pod "ingress-canary-kw9vq" (UID: "8d53317c-dbd7-4c13-ba73-b51e72998223") : secret "canary-serving-cert" not found
Apr 16 15:13:01.142684 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:01.142619 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n96lw"
Apr 16 15:13:07.757621 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:07.757436 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44"
Apr 16 15:13:07.760791 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:07.760773 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 15:13:07.768444 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:13:07.768428 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 15:13:07.768497 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:13:07.768484 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs podName:414282ac-76dc-4992-819a-bdf05043c09d nodeName:}" failed. No retries permitted until 2026-04-16 15:14:11.768469935 +0000 UTC m=+130.354673721 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs") pod "network-metrics-daemon-h9c44" (UID: "414282ac-76dc-4992-819a-bdf05043c09d") : secret "metrics-daemon-secret" not found
Apr 16 15:13:07.858753 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:07.858722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkdb\" (UniqueName: \"kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb\") pod \"network-check-target-ggs7q\" (UID: \"905049f6-70d0-42ea-bf9f-d37c4f436506\") " pod="openshift-network-diagnostics/network-check-target-ggs7q"
Apr 16 15:13:07.861931 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:07.861906 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 15:13:07.872114 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:07.872090 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 15:13:07.883057 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:07.883032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctkdb\" (UniqueName: \"kubernetes.io/projected/905049f6-70d0-42ea-bf9f-d37c4f436506-kube-api-access-ctkdb\") pod \"network-check-target-ggs7q\" (UID: \"905049f6-70d0-42ea-bf9f-d37c4f436506\") " pod="openshift-network-diagnostics/network-check-target-ggs7q"
Apr 16 15:13:07.924914 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:07.924883 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xkvgm\""
Apr 16 15:13:07.932583 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:07.932565 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggs7q"
Apr 16 15:13:08.054902 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:08.054873 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ggs7q"]
Apr 16 15:13:08.059310 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:13:08.059270 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod905049f6_70d0_42ea_bf9f_d37c4f436506.slice/crio-0d46f84af160178b3da3fd91172c6c455a695345c69901d01dc93b0917c03c37 WatchSource:0}: Error finding container 0d46f84af160178b3da3fd91172c6c455a695345c69901d01dc93b0917c03c37: Status 404 returned error can't find the container with id 0d46f84af160178b3da3fd91172c6c455a695345c69901d01dc93b0917c03c37
Apr 16 15:13:08.160606 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:08.160580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq"
Apr 16 15:13:08.160777 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:08.160621 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:13:08.160777 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:13:08.160742 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 15:13:08.160777 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:13:08.160748 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 15:13:08.160902 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:13:08.160793 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls podName:5c03038e-c257-4668-a7fa-62c438af26c8 nodeName:}" failed. No retries permitted until 2026-04-16 15:13:40.160779776 +0000 UTC m=+98.746983570 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls") pod "dns-default-mgxdh" (UID: "5c03038e-c257-4668-a7fa-62c438af26c8") : secret "dns-default-metrics-tls" not found
Apr 16 15:13:08.160902 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:13:08.160805 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert podName:8d53317c-dbd7-4c13-ba73-b51e72998223 nodeName:}" failed. No retries permitted until 2026-04-16 15:13:40.160800097 +0000 UTC m=+98.747003884 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert") pod "ingress-canary-kw9vq" (UID: "8d53317c-dbd7-4c13-ba73-b51e72998223") : secret "canary-serving-cert" not found
Apr 16 15:13:08.212482 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:08.212456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ggs7q" event={"ID":"905049f6-70d0-42ea-bf9f-d37c4f436506","Type":"ContainerStarted","Data":"0d46f84af160178b3da3fd91172c6c455a695345c69901d01dc93b0917c03c37"}
Apr 16 15:13:11.219158 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:11.219123 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ggs7q" event={"ID":"905049f6-70d0-42ea-bf9f-d37c4f436506","Type":"ContainerStarted","Data":"23d58b13d483c2f90bfe4d40862e5234d0e2da9d769d18b7f986d810a68d0a3b"}
Apr 16 15:13:11.219604 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:11.219242 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ggs7q"
Apr 16 15:13:11.239947 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:11.239900 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ggs7q" podStartSLOduration=66.54610391 podStartE2EDuration="1m9.239887897s" podCreationTimestamp="2026-04-16 15:12:02 +0000 UTC" firstStartedPulling="2026-04-16 15:13:08.06113289 +0000 UTC m=+66.647336677" lastFinishedPulling="2026-04-16 15:13:10.75491686 +0000 UTC m=+69.341120664" observedRunningTime="2026-04-16 15:13:11.239517415 +0000 UTC m=+69.825721224" watchObservedRunningTime="2026-04-16 15:13:11.239887897 +0000 UTC m=+69.826091706"
Apr 16 15:13:40.185934 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:40.185892 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\"
(UniqueName: \"kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq" Apr 16 15:13:40.185934 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:40.185945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh" Apr 16 15:13:40.186379 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:13:40.186044 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:13:40.186379 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:13:40.186046 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:13:40.186379 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:13:40.186111 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert podName:8d53317c-dbd7-4c13-ba73-b51e72998223 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:44.186095529 +0000 UTC m=+162.772299316 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert") pod "ingress-canary-kw9vq" (UID: "8d53317c-dbd7-4c13-ba73-b51e72998223") : secret "canary-serving-cert" not found Apr 16 15:13:40.186379 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:13:40.186125 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls podName:5c03038e-c257-4668-a7fa-62c438af26c8 nodeName:}" failed. 
No retries permitted until 2026-04-16 15:14:44.186118657 +0000 UTC m=+162.772322444 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls") pod "dns-default-mgxdh" (UID: "5c03038e-c257-4668-a7fa-62c438af26c8") : secret "dns-default-metrics-tls" not found Apr 16 15:13:42.223218 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:13:42.223185 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ggs7q" Apr 16 15:14:06.203769 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.203734 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sx5pd"] Apr 16 15:14:06.206386 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.206370 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sx5pd" Apr 16 15:14:06.209165 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.209149 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:14:06.209257 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.209148 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-dgstk\"" Apr 16 15:14:06.210425 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.210410 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 15:14:06.214536 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.214514 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sx5pd"] Apr 16 
15:14:06.258965 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.258929 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcthc\" (UniqueName: \"kubernetes.io/projected/af0b227b-f370-498a-9afb-9106b7108461-kube-api-access-zcthc\") pod \"volume-data-source-validator-7d955d5dd4-sx5pd\" (UID: \"af0b227b-f370-498a-9afb-9106b7108461\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sx5pd" Apr 16 15:14:06.299950 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.299922 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j"] Apr 16 15:14:06.302756 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.302740 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" Apr 16 15:14:06.305387 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.305364 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 15:14:06.305508 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.305373 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 15:14:06.305508 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.305397 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:14:06.305508 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.305398 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-n4bg8\"" Apr 
16 15:14:06.305508 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.305373 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 15:14:06.312022 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.312003 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-fc75r"] Apr 16 15:14:06.314859 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.314841 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-xlsw6"] Apr 16 15:14:06.314992 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.314975 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.317475 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.317445 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j"] Apr 16 15:14:06.317475 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.317463 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 15:14:06.317732 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.317489 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 15:14:06.317732 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.317562 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.318105 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.318080 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 15:14:06.318195 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.318131 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-f6jln\"" Apr 16 15:14:06.318499 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.318478 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 15:14:06.320641 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.320618 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 15:14:06.320762 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.320706 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:14:06.320762 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.320727 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4zh7j\"" Apr 16 15:14:06.321003 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.320982 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 15:14:06.321107 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.321038 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 15:14:06.324564 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.324545 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 15:14:06.328916 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.328894 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 15:14:06.331209 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.331189 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-fc75r"] Apr 16 15:14:06.331839 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.331821 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-xlsw6"] Apr 16 15:14:06.369288 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.369247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b174e8-ba16-492a-9a00-8245945ade22-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.369480 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.369301 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p98sk\" (UniqueName: \"kubernetes.io/projected/8f600d4b-94cc-4150-8ca8-7002385a1b56-kube-api-access-p98sk\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qnf9j\" (UID: \"8f600d4b-94cc-4150-8ca8-7002385a1b56\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" Apr 16 15:14:06.369480 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.369334 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcthc\" (UniqueName: \"kubernetes.io/projected/af0b227b-f370-498a-9afb-9106b7108461-kube-api-access-zcthc\") pod 
\"volume-data-source-validator-7d955d5dd4-sx5pd\" (UID: \"af0b227b-f370-498a-9afb-9106b7108461\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sx5pd" Apr 16 15:14:06.369480 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.369359 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b174e8-ba16-492a-9a00-8245945ade22-serving-cert\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.369480 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.369391 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03b174e8-ba16-492a-9a00-8245945ade22-tmp\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.369480 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.369414 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/03b174e8-ba16-492a-9a00-8245945ade22-snapshots\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.369480 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.369444 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b174e8-ba16-492a-9a00-8245945ade22-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.369841 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:14:06.369788 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f600d4b-94cc-4150-8ca8-7002385a1b56-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qnf9j\" (UID: \"8f600d4b-94cc-4150-8ca8-7002385a1b56\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" Apr 16 15:14:06.369898 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.369882 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skkzd\" (UniqueName: \"kubernetes.io/projected/03b174e8-ba16-492a-9a00-8245945ade22-kube-api-access-skkzd\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.370440 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.369948 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f600d4b-94cc-4150-8ca8-7002385a1b56-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qnf9j\" (UID: \"8f600d4b-94cc-4150-8ca8-7002385a1b56\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" Apr 16 15:14:06.378242 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.378213 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcthc\" (UniqueName: \"kubernetes.io/projected/af0b227b-f370-498a-9afb-9106b7108461-kube-api-access-zcthc\") pod \"volume-data-source-validator-7d955d5dd4-sx5pd\" (UID: \"af0b227b-f370-498a-9afb-9106b7108461\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sx5pd" Apr 16 15:14:06.402445 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.402425 2573 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-58ff4b54fd-qzf5r"] Apr 16 15:14:06.405236 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.405220 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.409322 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.409300 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 15:14:06.409422 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.409394 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 15:14:06.409493 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.409424 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 15:14:06.409545 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.409506 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 15:14:06.409758 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.409739 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 15:14:06.409868 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.409794 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-k8ttd\"" Apr 16 15:14:06.409868 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.409794 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 15:14:06.419627 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.419606 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-58ff4b54fd-qzf5r"] Apr 16 
15:14:06.471205 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471137 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b174e8-ba16-492a-9a00-8245945ade22-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.471205 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471177 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p98sk\" (UniqueName: \"kubernetes.io/projected/8f600d4b-94cc-4150-8ca8-7002385a1b56-kube-api-access-p98sk\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qnf9j\" (UID: \"8f600d4b-94cc-4150-8ca8-7002385a1b56\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" Apr 16 15:14:06.471369 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471208 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b174e8-ba16-492a-9a00-8245945ade22-serving-cert\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.471369 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471235 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22e6f93e-aa56-4c51-85da-4f105af8d76d-trusted-ca\") pod \"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.471369 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471260 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-stats-auth\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.471369 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471283 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqcvl\" (UniqueName: \"kubernetes.io/projected/c378b37b-c6a3-4eee-8f58-73402bb067b1-kube-api-access-jqcvl\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.471369 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03b174e8-ba16-492a-9a00-8245945ade22-tmp\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.471369 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/03b174e8-ba16-492a-9a00-8245945ade22-snapshots\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.471601 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471447 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rf9f\" (UniqueName: \"kubernetes.io/projected/22e6f93e-aa56-4c51-85da-4f105af8d76d-kube-api-access-7rf9f\") pod \"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 
16 15:14:06.471601 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471472 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-default-certificate\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.471601 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471500 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e6f93e-aa56-4c51-85da-4f105af8d76d-serving-cert\") pod \"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.471601 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b174e8-ba16-492a-9a00-8245945ade22-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.471821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.471821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471640 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8f600d4b-94cc-4150-8ca8-7002385a1b56-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qnf9j\" (UID: \"8f600d4b-94cc-4150-8ca8-7002385a1b56\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" Apr 16 15:14:06.471821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471704 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03b174e8-ba16-492a-9a00-8245945ade22-tmp\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.471821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skkzd\" (UniqueName: \"kubernetes.io/projected/03b174e8-ba16-492a-9a00-8245945ade22-kube-api-access-skkzd\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.471821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471763 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.472049 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471820 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f600d4b-94cc-4150-8ca8-7002385a1b56-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qnf9j\" (UID: \"8f600d4b-94cc-4150-8ca8-7002385a1b56\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" Apr 16 15:14:06.472049 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.471884 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e6f93e-aa56-4c51-85da-4f105af8d76d-config\") pod \"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.472049 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.472033 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b174e8-ba16-492a-9a00-8245945ade22-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.472198 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.472043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/03b174e8-ba16-492a-9a00-8245945ade22-snapshots\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.472198 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.472132 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b174e8-ba16-492a-9a00-8245945ade22-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.472429 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.472406 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8f600d4b-94cc-4150-8ca8-7002385a1b56-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qnf9j\" (UID: \"8f600d4b-94cc-4150-8ca8-7002385a1b56\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" Apr 16 15:14:06.474059 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.474040 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f600d4b-94cc-4150-8ca8-7002385a1b56-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qnf9j\" (UID: \"8f600d4b-94cc-4150-8ca8-7002385a1b56\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" Apr 16 15:14:06.474106 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.474048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b174e8-ba16-492a-9a00-8245945ade22-serving-cert\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.479244 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.479222 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skkzd\" (UniqueName: \"kubernetes.io/projected/03b174e8-ba16-492a-9a00-8245945ade22-kube-api-access-skkzd\") pod \"insights-operator-5785d4fcdd-fc75r\" (UID: \"03b174e8-ba16-492a-9a00-8245945ade22\") " pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.479346 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.479327 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p98sk\" (UniqueName: \"kubernetes.io/projected/8f600d4b-94cc-4150-8ca8-7002385a1b56-kube-api-access-p98sk\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qnf9j\" (UID: 
\"8f600d4b-94cc-4150-8ca8-7002385a1b56\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" Apr 16 15:14:06.515843 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.515820 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sx5pd" Apr 16 15:14:06.572791 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.572641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22e6f93e-aa56-4c51-85da-4f105af8d76d-trusted-ca\") pod \"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.572791 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.572710 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-stats-auth\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.572791 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.572737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqcvl\" (UniqueName: \"kubernetes.io/projected/c378b37b-c6a3-4eee-8f58-73402bb067b1-kube-api-access-jqcvl\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.572791 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.572774 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rf9f\" (UniqueName: \"kubernetes.io/projected/22e6f93e-aa56-4c51-85da-4f105af8d76d-kube-api-access-7rf9f\") pod 
\"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.573060 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.572798 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-default-certificate\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.573060 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.572828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e6f93e-aa56-4c51-85da-4f105af8d76d-serving-cert\") pod \"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.573060 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.572855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.573060 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.572880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.573060 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.572955 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e6f93e-aa56-4c51-85da-4f105af8d76d-config\") pod \"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.574247 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:06.573462 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 15:14:06.574247 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:06.573549 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs podName:c378b37b-c6a3-4eee-8f58-73402bb067b1 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:07.073526448 +0000 UTC m=+125.659730250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs") pod "router-default-58ff4b54fd-qzf5r" (UID: "c378b37b-c6a3-4eee-8f58-73402bb067b1") : secret "router-metrics-certs-default" not found Apr 16 15:14:06.574247 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:06.573823 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle podName:c378b37b-c6a3-4eee-8f58-73402bb067b1 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:07.073803888 +0000 UTC m=+125.660007694 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle") pod "router-default-58ff4b54fd-qzf5r" (UID: "c378b37b-c6a3-4eee-8f58-73402bb067b1") : configmap references non-existent config key: service-ca.crt Apr 16 15:14:06.574478 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.574342 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e6f93e-aa56-4c51-85da-4f105af8d76d-config\") pod \"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.575320 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.575265 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22e6f93e-aa56-4c51-85da-4f105af8d76d-trusted-ca\") pod \"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.577226 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.577184 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-stats-auth\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.577711 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.577673 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e6f93e-aa56-4c51-85da-4f105af8d76d-serving-cert\") pod \"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.578618 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:14:06.578598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-default-certificate\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.585444 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.585425 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rf9f\" (UniqueName: \"kubernetes.io/projected/22e6f93e-aa56-4c51-85da-4f105af8d76d-kube-api-access-7rf9f\") pod \"console-operator-d87b8d5fc-xlsw6\" (UID: \"22e6f93e-aa56-4c51-85da-4f105af8d76d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.586549 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.586532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqcvl\" (UniqueName: \"kubernetes.io/projected/c378b37b-c6a3-4eee-8f58-73402bb067b1-kube-api-access-jqcvl\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:06.611024 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.610999 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" Apr 16 15:14:06.625059 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.625026 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" Apr 16 15:14:06.629793 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.629767 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:14:06.634563 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.634533 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sx5pd"] Apr 16 15:14:06.640265 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:06.640233 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf0b227b_f370_498a_9afb_9106b7108461.slice/crio-3a4d54935bbc40817e6c8bb524f20412bcf11280ffe6b5c5a4fcc989358c53fc WatchSource:0}: Error finding container 3a4d54935bbc40817e6c8bb524f20412bcf11280ffe6b5c5a4fcc989358c53fc: Status 404 returned error can't find the container with id 3a4d54935bbc40817e6c8bb524f20412bcf11280ffe6b5c5a4fcc989358c53fc Apr 16 15:14:06.752298 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.752263 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j"] Apr 16 15:14:06.755959 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:06.755930 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f600d4b_94cc_4150_8ca8_7002385a1b56.slice/crio-4acfd7f54948787bcc8fc3ada338cd570b0885e3b352720e0ae4167aa5a13c7f WatchSource:0}: Error finding container 4acfd7f54948787bcc8fc3ada338cd570b0885e3b352720e0ae4167aa5a13c7f: Status 404 returned error can't find the container with id 4acfd7f54948787bcc8fc3ada338cd570b0885e3b352720e0ae4167aa5a13c7f Apr 16 15:14:06.972358 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.972327 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-xlsw6"] Apr 16 15:14:06.975121 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:06.975088 2573 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22e6f93e_aa56_4c51_85da_4f105af8d76d.slice/crio-cb38798d33da32561e77081fe6381493d1a3b984b189949e659a07acf13f3d1b WatchSource:0}: Error finding container cb38798d33da32561e77081fe6381493d1a3b984b189949e659a07acf13f3d1b: Status 404 returned error can't find the container with id cb38798d33da32561e77081fe6381493d1a3b984b189949e659a07acf13f3d1b Apr 16 15:14:06.975793 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:06.975773 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-fc75r"] Apr 16 15:14:06.978718 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:06.978697 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b174e8_ba16_492a_9a00_8245945ade22.slice/crio-ca68aabcb1febb8d31159b77fdfd08e73cc2835f4fbb73dd2305ffe52b3ff8a3 WatchSource:0}: Error finding container ca68aabcb1febb8d31159b77fdfd08e73cc2835f4fbb73dd2305ffe52b3ff8a3: Status 404 returned error can't find the container with id ca68aabcb1febb8d31159b77fdfd08e73cc2835f4fbb73dd2305ffe52b3ff8a3 Apr 16 15:14:07.077319 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:07.077288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:07.077319 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:07.077320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " 
pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:07.077503 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:07.077424 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 15:14:07.077503 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:07.077464 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs podName:c378b37b-c6a3-4eee-8f58-73402bb067b1 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:08.077451873 +0000 UTC m=+126.663655659 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs") pod "router-default-58ff4b54fd-qzf5r" (UID: "c378b37b-c6a3-4eee-8f58-73402bb067b1") : secret "router-metrics-certs-default" not found Apr 16 15:14:07.077600 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:07.077579 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle podName:c378b37b-c6a3-4eee-8f58-73402bb067b1 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:08.077561312 +0000 UTC m=+126.663765105 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle") pod "router-default-58ff4b54fd-qzf5r" (UID: "c378b37b-c6a3-4eee-8f58-73402bb067b1") : configmap references non-existent config key: service-ca.crt Apr 16 15:14:07.321638 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:07.321548 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" event={"ID":"03b174e8-ba16-492a-9a00-8245945ade22","Type":"ContainerStarted","Data":"ca68aabcb1febb8d31159b77fdfd08e73cc2835f4fbb73dd2305ffe52b3ff8a3"} Apr 16 15:14:07.322570 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:07.322543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sx5pd" event={"ID":"af0b227b-f370-498a-9afb-9106b7108461","Type":"ContainerStarted","Data":"3a4d54935bbc40817e6c8bb524f20412bcf11280ffe6b5c5a4fcc989358c53fc"} Apr 16 15:14:07.323585 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:07.323563 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" event={"ID":"22e6f93e-aa56-4c51-85da-4f105af8d76d","Type":"ContainerStarted","Data":"cb38798d33da32561e77081fe6381493d1a3b984b189949e659a07acf13f3d1b"} Apr 16 15:14:07.324581 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:07.324558 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" event={"ID":"8f600d4b-94cc-4150-8ca8-7002385a1b56","Type":"ContainerStarted","Data":"4acfd7f54948787bcc8fc3ada338cd570b0885e3b352720e0ae4167aa5a13c7f"} Apr 16 15:14:08.085261 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:08.085228 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:08.085391 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:08.085278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:08.085878 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:08.085512 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle podName:c378b37b-c6a3-4eee-8f58-73402bb067b1 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:10.08549118 +0000 UTC m=+128.671694968 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle") pod "router-default-58ff4b54fd-qzf5r" (UID: "c378b37b-c6a3-4eee-8f58-73402bb067b1") : configmap references non-existent config key: service-ca.crt Apr 16 15:14:08.085878 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:08.085558 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 15:14:08.085878 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:08.085624 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs podName:c378b37b-c6a3-4eee-8f58-73402bb067b1 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:10.085606688 +0000 UTC m=+128.671810495 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs") pod "router-default-58ff4b54fd-qzf5r" (UID: "c378b37b-c6a3-4eee-8f58-73402bb067b1") : secret "router-metrics-certs-default" not found Apr 16 15:14:08.329728 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:08.329308 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sx5pd" event={"ID":"af0b227b-f370-498a-9afb-9106b7108461","Type":"ContainerStarted","Data":"f225f702d0810e48380e734a14a5bbd7cae7d315a1fa87c83f7000e75cb9f959"} Apr 16 15:14:08.348102 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:08.348047 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sx5pd" podStartSLOduration=0.920634718 podStartE2EDuration="2.348028763s" podCreationTimestamp="2026-04-16 15:14:06 +0000 UTC" firstStartedPulling="2026-04-16 15:14:06.64232897 +0000 UTC m=+125.228532761" lastFinishedPulling="2026-04-16 15:14:08.069723005 +0000 UTC m=+126.655926806" observedRunningTime="2026-04-16 15:14:08.347106168 +0000 UTC m=+126.933310022" watchObservedRunningTime="2026-04-16 15:14:08.348028763 +0000 UTC m=+126.934232573" Apr 16 15:14:10.100878 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:10.100842 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:10.100878 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:10.100888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" Apr 16 15:14:10.101257 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:10.101017 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 15:14:10.101257 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:10.101059 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle podName:c378b37b-c6a3-4eee-8f58-73402bb067b1 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:14.101043827 +0000 UTC m=+132.687247613 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle") pod "router-default-58ff4b54fd-qzf5r" (UID: "c378b37b-c6a3-4eee-8f58-73402bb067b1") : configmap references non-existent config key: service-ca.crt Apr 16 15:14:10.101257 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:10.101097 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs podName:c378b37b-c6a3-4eee-8f58-73402bb067b1 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:14.101070768 +0000 UTC m=+132.687274562 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs") pod "router-default-58ff4b54fd-qzf5r" (UID: "c378b37b-c6a3-4eee-8f58-73402bb067b1") : secret "router-metrics-certs-default" not found Apr 16 15:14:10.335604 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:10.335573 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/0.log" Apr 16 15:14:10.335905 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:10.335622 2573 generic.go:358] "Generic (PLEG): container finished" podID="22e6f93e-aa56-4c51-85da-4f105af8d76d" containerID="a91ee62b68d22b315b8ec6b98f54c610cb927197c6b04c60855aa92c3b798f91" exitCode=255 Apr 16 15:14:10.335905 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:10.335689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" event={"ID":"22e6f93e-aa56-4c51-85da-4f105af8d76d","Type":"ContainerDied","Data":"a91ee62b68d22b315b8ec6b98f54c610cb927197c6b04c60855aa92c3b798f91"} Apr 16 15:14:10.336018 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:10.335941 2573 scope.go:117] "RemoveContainer" containerID="a91ee62b68d22b315b8ec6b98f54c610cb927197c6b04c60855aa92c3b798f91" Apr 16 15:14:10.337180 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:10.337152 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" event={"ID":"8f600d4b-94cc-4150-8ca8-7002385a1b56","Type":"ContainerStarted","Data":"a97176a8a7a5863107af02ee6cc699d53eb83e89ca9f844308834171b6006a16"} Apr 16 15:14:10.338579 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:10.338556 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" 
event={"ID":"03b174e8-ba16-492a-9a00-8245945ade22","Type":"ContainerStarted","Data":"8e83905dc4a28c1a58a033a0efb6e0c8319ea2a08da57c9b6a651ee19fb33846"} Apr 16 15:14:10.408567 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:10.408519 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" podStartSLOduration=1.282187054 podStartE2EDuration="4.408502213s" podCreationTimestamp="2026-04-16 15:14:06 +0000 UTC" firstStartedPulling="2026-04-16 15:14:06.757683336 +0000 UTC m=+125.343887123" lastFinishedPulling="2026-04-16 15:14:09.883998478 +0000 UTC m=+128.470202282" observedRunningTime="2026-04-16 15:14:10.407966885 +0000 UTC m=+128.994170698" watchObservedRunningTime="2026-04-16 15:14:10.408502213 +0000 UTC m=+128.994706023" Apr 16 15:14:10.427764 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:10.427482 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" podStartSLOduration=1.519400204 podStartE2EDuration="4.427461009s" podCreationTimestamp="2026-04-16 15:14:06 +0000 UTC" firstStartedPulling="2026-04-16 15:14:06.98025661 +0000 UTC m=+125.566460397" lastFinishedPulling="2026-04-16 15:14:09.888317415 +0000 UTC m=+128.474521202" observedRunningTime="2026-04-16 15:14:10.426084939 +0000 UTC m=+129.012288749" watchObservedRunningTime="2026-04-16 15:14:10.427461009 +0000 UTC m=+129.013664820" Apr 16 15:14:11.342442 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.342367 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/1.log" Apr 16 15:14:11.342881 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.342769 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/0.log" Apr 16 15:14:11.342881 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.342811 2573 generic.go:358] "Generic (PLEG): container finished" podID="22e6f93e-aa56-4c51-85da-4f105af8d76d" containerID="8ae4bfcecd3323c2a99a603b4dd4734f4f8fa43d8662083faad77ee895bd21c8" exitCode=255 Apr 16 15:14:11.343008 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.342882 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" event={"ID":"22e6f93e-aa56-4c51-85da-4f105af8d76d","Type":"ContainerDied","Data":"8ae4bfcecd3323c2a99a603b4dd4734f4f8fa43d8662083faad77ee895bd21c8"} Apr 16 15:14:11.343008 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.342925 2573 scope.go:117] "RemoveContainer" containerID="a91ee62b68d22b315b8ec6b98f54c610cb927197c6b04c60855aa92c3b798f91" Apr 16 15:14:11.343502 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.343354 2573 scope.go:117] "RemoveContainer" containerID="8ae4bfcecd3323c2a99a603b4dd4734f4f8fa43d8662083faad77ee895bd21c8" Apr 16 15:14:11.343607 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:11.343541 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-xlsw6_openshift-console-operator(22e6f93e-aa56-4c51-85da-4f105af8d76d)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" podUID="22e6f93e-aa56-4c51-85da-4f105af8d76d" Apr 16 15:14:11.494262 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.494222 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79"] Apr 16 15:14:11.497457 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.497435 2573 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79" Apr 16 15:14:11.500479 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.500456 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 15:14:11.500609 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.500499 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-p2zt9\"" Apr 16 15:14:11.500609 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.500503 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 15:14:11.507099 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.507075 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79"] Apr 16 15:14:11.613206 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.613112 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gnlx\" (UniqueName: \"kubernetes.io/projected/3c33efa9-0d42-4773-96d7-a6aea9290b8b-kube-api-access-2gnlx\") pod \"migrator-64d4d94569-gms79\" (UID: \"3c33efa9-0d42-4773-96d7-a6aea9290b8b\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79" Apr 16 15:14:11.714087 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.714051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gnlx\" (UniqueName: \"kubernetes.io/projected/3c33efa9-0d42-4773-96d7-a6aea9290b8b-kube-api-access-2gnlx\") pod \"migrator-64d4d94569-gms79\" (UID: \"3c33efa9-0d42-4773-96d7-a6aea9290b8b\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79" Apr 16 15:14:11.723180 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:14:11.723146 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gnlx\" (UniqueName: \"kubernetes.io/projected/3c33efa9-0d42-4773-96d7-a6aea9290b8b-kube-api-access-2gnlx\") pod \"migrator-64d4d94569-gms79\" (UID: \"3c33efa9-0d42-4773-96d7-a6aea9290b8b\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79"
Apr 16 15:14:11.806213 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.806180 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79"
Apr 16 15:14:11.815303 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.815276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44"
Apr 16 15:14:11.815434 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:11.815422 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 15:14:11.815508 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:11.815497 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs podName:414282ac-76dc-4992-819a-bdf05043c09d nodeName:}" failed. No retries permitted until 2026-04-16 15:16:13.815482468 +0000 UTC m=+252.401686274 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs") pod "network-metrics-daemon-h9c44" (UID: "414282ac-76dc-4992-819a-bdf05043c09d") : secret "metrics-daemon-secret" not found
Apr 16 15:14:11.927814 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:11.927781 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79"]
Apr 16 15:14:11.930943 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:11.930918 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c33efa9_0d42_4773_96d7_a6aea9290b8b.slice/crio-c5a22fc64a657cb54e919b8ce4a8ac7c882c5bcca16e7b9f3b3d0d6bc37178fb WatchSource:0}: Error finding container c5a22fc64a657cb54e919b8ce4a8ac7c882c5bcca16e7b9f3b3d0d6bc37178fb: Status 404 returned error can't find the container with id c5a22fc64a657cb54e919b8ce4a8ac7c882c5bcca16e7b9f3b3d0d6bc37178fb
Apr 16 15:14:12.346435 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:12.346399 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79" event={"ID":"3c33efa9-0d42-4773-96d7-a6aea9290b8b","Type":"ContainerStarted","Data":"c5a22fc64a657cb54e919b8ce4a8ac7c882c5bcca16e7b9f3b3d0d6bc37178fb"}
Apr 16 15:14:12.347778 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:12.347760 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/1.log"
Apr 16 15:14:12.348135 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:12.348119 2573 scope.go:117] "RemoveContainer" containerID="8ae4bfcecd3323c2a99a603b4dd4734f4f8fa43d8662083faad77ee895bd21c8"
Apr 16 15:14:12.348298 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:12.348282 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-xlsw6_openshift-console-operator(22e6f93e-aa56-4c51-85da-4f105af8d76d)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" podUID="22e6f93e-aa56-4c51-85da-4f105af8d76d"
Apr 16 15:14:13.351891 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:13.351806 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79" event={"ID":"3c33efa9-0d42-4773-96d7-a6aea9290b8b","Type":"ContainerStarted","Data":"96b186d46023082e14a75f308190990ab1bacfc2874c1a77cca0f6d0db05a59e"}
Apr 16 15:14:13.351891 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:13.351844 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79" event={"ID":"3c33efa9-0d42-4773-96d7-a6aea9290b8b","Type":"ContainerStarted","Data":"5283af65d89d44cc204c9af5cc2bfb957c85500a077acc783acd0e2fa4675f17"}
Apr 16 15:14:13.372348 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:13.372304 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-gms79" podStartSLOduration=1.239827287 podStartE2EDuration="2.372290659s" podCreationTimestamp="2026-04-16 15:14:11 +0000 UTC" firstStartedPulling="2026-04-16 15:14:11.933348509 +0000 UTC m=+130.519552296" lastFinishedPulling="2026-04-16 15:14:13.065811879 +0000 UTC m=+131.652015668" observedRunningTime="2026-04-16 15:14:13.370684684 +0000 UTC m=+131.956888494" watchObservedRunningTime="2026-04-16 15:14:13.372290659 +0000 UTC m=+131.958494509"
Apr 16 15:14:14.131422 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:14.131385 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r"
Apr 16 15:14:14.131422 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:14.131423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r"
Apr 16 15:14:14.131707 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:14.131540 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 15:14:14.131707 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:14.131611 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs podName:c378b37b-c6a3-4eee-8f58-73402bb067b1 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:22.131593685 +0000 UTC m=+140.717797486 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs") pod "router-default-58ff4b54fd-qzf5r" (UID: "c378b37b-c6a3-4eee-8f58-73402bb067b1") : secret "router-metrics-certs-default" not found
Apr 16 15:14:14.131707 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:14.131630 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle podName:c378b37b-c6a3-4eee-8f58-73402bb067b1 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:22.131620959 +0000 UTC m=+140.717824749 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle") pod "router-default-58ff4b54fd-qzf5r" (UID: "c378b37b-c6a3-4eee-8f58-73402bb067b1") : configmap references non-existent config key: service-ca.crt
Apr 16 15:14:15.126822 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:15.126794 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sjt8j_1462d2c4-a9b3-43d9-a968-33a1359d6ccc/dns-node-resolver/0.log"
Apr 16 15:14:15.726989 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:15.726958 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4pfr6_bf188288-b071-4a5d-867d-17022354cd51/node-ca/0.log"
Apr 16 15:14:16.630543 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:16.630490 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6"
Apr 16 15:14:16.630543 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:16.630546 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6"
Apr 16 15:14:16.631093 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:16.631006 2573 scope.go:117] "RemoveContainer" containerID="8ae4bfcecd3323c2a99a603b4dd4734f4f8fa43d8662083faad77ee895bd21c8"
Apr 16 15:14:16.631233 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:16.631211 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-xlsw6_openshift-console-operator(22e6f93e-aa56-4c51-85da-4f105af8d76d)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" podUID="22e6f93e-aa56-4c51-85da-4f105af8d76d"
Apr 16 15:14:17.128910 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:17.128876 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-gms79_3c33efa9-0d42-4773-96d7-a6aea9290b8b/migrator/0.log"
Apr 16 15:14:17.329711 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:17.329669 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-gms79_3c33efa9-0d42-4773-96d7-a6aea9290b8b/graceful-termination/0.log"
Apr 16 15:14:17.528154 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:17.528123 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-qnf9j_8f600d4b-94cc-4150-8ca8-7002385a1b56/kube-storage-version-migrator-operator/0.log"
Apr 16 15:14:22.196002 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:22.195966 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r"
Apr 16 15:14:22.196443 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:22.196025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r"
Apr 16 15:14:22.196866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:22.196796 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c378b37b-c6a3-4eee-8f58-73402bb067b1-service-ca-bundle\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r"
Apr 16 15:14:22.199292 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:22.198950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c378b37b-c6a3-4eee-8f58-73402bb067b1-metrics-certs\") pod \"router-default-58ff4b54fd-qzf5r\" (UID: \"c378b37b-c6a3-4eee-8f58-73402bb067b1\") " pod="openshift-ingress/router-default-58ff4b54fd-qzf5r"
Apr 16 15:14:22.313346 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:22.313311 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-58ff4b54fd-qzf5r"
Apr 16 15:14:22.442066 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:22.442036 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-58ff4b54fd-qzf5r"]
Apr 16 15:14:22.446234 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:22.446171 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc378b37b_c6a3_4eee_8f58_73402bb067b1.slice/crio-3931cf87d2e0f48f0180742bb851043bec57cc9aefff22409ddad38cee022a97 WatchSource:0}: Error finding container 3931cf87d2e0f48f0180742bb851043bec57cc9aefff22409ddad38cee022a97: Status 404 returned error can't find the container with id 3931cf87d2e0f48f0180742bb851043bec57cc9aefff22409ddad38cee022a97
Apr 16 15:14:23.378198 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:23.378161 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" event={"ID":"c378b37b-c6a3-4eee-8f58-73402bb067b1","Type":"ContainerStarted","Data":"c531f120550aac5dc50efcd7dce78a8a1e9a5bb944ead3af403c26f610db0a81"}
Apr 16 15:14:23.378198 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:23.378200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" event={"ID":"c378b37b-c6a3-4eee-8f58-73402bb067b1","Type":"ContainerStarted","Data":"3931cf87d2e0f48f0180742bb851043bec57cc9aefff22409ddad38cee022a97"}
Apr 16 15:14:24.313541 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:24.313500 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-58ff4b54fd-qzf5r"
Apr 16 15:14:24.316866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:24.316842 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-58ff4b54fd-qzf5r"
Apr 16 15:14:24.337900 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:24.337850 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-58ff4b54fd-qzf5r" podStartSLOduration=18.33783313 podStartE2EDuration="18.33783313s" podCreationTimestamp="2026-04-16 15:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:14:23.399396256 +0000 UTC m=+141.985600064" watchObservedRunningTime="2026-04-16 15:14:24.33783313 +0000 UTC m=+142.924036940"
Apr 16 15:14:24.380884 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:24.380855 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-58ff4b54fd-qzf5r"
Apr 16 15:14:24.382027 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:24.382005 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-58ff4b54fd-qzf5r"
Apr 16 15:14:28.013814 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:28.013789 2573 scope.go:117] "RemoveContainer" containerID="8ae4bfcecd3323c2a99a603b4dd4734f4f8fa43d8662083faad77ee895bd21c8"
Apr 16 15:14:28.392019 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:28.391946 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log"
Apr 16 15:14:28.392323 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:28.392308 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/1.log"
Apr 16 15:14:28.392365 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:28.392338 2573 generic.go:358] "Generic (PLEG): container finished" podID="22e6f93e-aa56-4c51-85da-4f105af8d76d" containerID="fd07b0070f6eb2ca78096c2016803510adff02d82e27c460dafdcb06d981e768" exitCode=255
Apr 16 15:14:28.392405 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:28.392387 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" event={"ID":"22e6f93e-aa56-4c51-85da-4f105af8d76d","Type":"ContainerDied","Data":"fd07b0070f6eb2ca78096c2016803510adff02d82e27c460dafdcb06d981e768"}
Apr 16 15:14:28.392435 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:28.392419 2573 scope.go:117] "RemoveContainer" containerID="8ae4bfcecd3323c2a99a603b4dd4734f4f8fa43d8662083faad77ee895bd21c8"
Apr 16 15:14:28.392750 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:28.392728 2573 scope.go:117] "RemoveContainer" containerID="fd07b0070f6eb2ca78096c2016803510adff02d82e27c460dafdcb06d981e768"
Apr 16 15:14:28.392964 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:28.392945 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-xlsw6_openshift-console-operator(22e6f93e-aa56-4c51-85da-4f105af8d76d)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" podUID="22e6f93e-aa56-4c51-85da-4f105af8d76d"
Apr 16 15:14:29.396609 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:29.396582 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log"
Apr 16 15:14:32.042327 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.042294 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-d6m95"]
Apr 16 15:14:32.046727 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.046704 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.049562 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.049542 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 15:14:32.049856 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.049833 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 15:14:32.051039 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.051018 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wvcg7\""
Apr 16 15:14:32.061397 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.061374 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d6m95"]
Apr 16 15:14:32.069346 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.069326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jncfd\" (UniqueName: \"kubernetes.io/projected/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-kube-api-access-jncfd\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.069468 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.069368 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.069468 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.069389 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.069468 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.069449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-data-volume\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.069620 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.069499 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-crio-socket\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.170688 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.170642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jncfd\" (UniqueName: \"kubernetes.io/projected/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-kube-api-access-jncfd\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.170848 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.170707 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.170848 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.170727 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.170848 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.170749 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-data-volume\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.170848 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.170765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-crio-socket\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.171014 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.170853 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-crio-socket\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.171149 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.171125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-data-volume\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.171311 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.171294 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.173724 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.173709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.181945 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.181921 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jncfd\" (UniqueName: \"kubernetes.io/projected/0554221f-bc0c-43b7-ac3f-0baa33a72fc7-kube-api-access-jncfd\") pod \"insights-runtime-extractor-d6m95\" (UID: \"0554221f-bc0c-43b7-ac3f-0baa33a72fc7\") " pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.356127 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.356034 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d6m95"
Apr 16 15:14:32.473532 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:32.473439 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d6m95"]
Apr 16 15:14:32.475916 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:32.475890 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0554221f_bc0c_43b7_ac3f_0baa33a72fc7.slice/crio-61896042c6ac7afb9c37f16fc9caa843ec465131bd59d6769c1647a835eefb1a WatchSource:0}: Error finding container 61896042c6ac7afb9c37f16fc9caa843ec465131bd59d6769c1647a835eefb1a: Status 404 returned error can't find the container with id 61896042c6ac7afb9c37f16fc9caa843ec465131bd59d6769c1647a835eefb1a
Apr 16 15:14:33.408536 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:33.408505 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d6m95" event={"ID":"0554221f-bc0c-43b7-ac3f-0baa33a72fc7","Type":"ContainerStarted","Data":"f3bfc227ae2ede6c0301766fb3ecc2bdd5a34e4455e18da48e52f14584b68e19"}
Apr 16 15:14:33.408896 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:33.408544 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d6m95" event={"ID":"0554221f-bc0c-43b7-ac3f-0baa33a72fc7","Type":"ContainerStarted","Data":"874b33c5436a5193a43897023dbc52f8a081b0466c969c249121721d32303967"}
Apr 16 15:14:33.408896 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:33.408552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d6m95" event={"ID":"0554221f-bc0c-43b7-ac3f-0baa33a72fc7","Type":"ContainerStarted","Data":"61896042c6ac7afb9c37f16fc9caa843ec465131bd59d6769c1647a835eefb1a"}
Apr 16 15:14:35.415114 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:35.415079 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d6m95" event={"ID":"0554221f-bc0c-43b7-ac3f-0baa33a72fc7","Type":"ContainerStarted","Data":"8e6ebcc8840efac0d8704bc97ecd990f4d6f83d5572907a2591960826b30c547"}
Apr 16 15:14:35.436738 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:35.436685 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-d6m95" podStartSLOduration=1.131142024 podStartE2EDuration="3.43667015s" podCreationTimestamp="2026-04-16 15:14:32 +0000 UTC" firstStartedPulling="2026-04-16 15:14:32.526964463 +0000 UTC m=+151.113168253" lastFinishedPulling="2026-04-16 15:14:34.832492588 +0000 UTC m=+153.418696379" observedRunningTime="2026-04-16 15:14:35.436439025 +0000 UTC m=+154.022642836" watchObservedRunningTime="2026-04-16 15:14:35.43667015 +0000 UTC m=+154.022873954"
Apr 16 15:14:36.630437 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:36.630392 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6"
Apr 16 15:14:36.630437 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:36.630443 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6"
Apr 16 15:14:36.630867 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:36.630776 2573 scope.go:117] "RemoveContainer" containerID="fd07b0070f6eb2ca78096c2016803510adff02d82e27c460dafdcb06d981e768"
Apr 16 15:14:36.630953 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:36.630936 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-xlsw6_openshift-console-operator(22e6f93e-aa56-4c51-85da-4f105af8d76d)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" podUID="22e6f93e-aa56-4c51-85da-4f105af8d76d"
Apr 16 15:14:39.313407 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:39.313375 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mgxdh" podUID="5c03038e-c257-4668-a7fa-62c438af26c8"
Apr 16 15:14:39.326808 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:39.326773 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kw9vq" podUID="8d53317c-dbd7-4c13-ba73-b51e72998223"
Apr 16 15:14:39.424932 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:39.424903 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mgxdh"
Apr 16 15:14:40.548172 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:40.548141 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws"]
Apr 16 15:14:40.551514 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:40.551499 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws"
Apr 16 15:14:40.554409 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:40.554382 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 15:14:40.554513 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:40.554490 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-zmtmt\""
Apr 16 15:14:40.561905 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:40.561882 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws"]
Apr 16 15:14:40.632463 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:40.632426 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9a07ebbc-7250-4238-9bbe-e0feea8ed073-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-fcsws\" (UID: \"9a07ebbc-7250-4238-9bbe-e0feea8ed073\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws"
Apr 16 15:14:40.733496 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:40.733459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9a07ebbc-7250-4238-9bbe-e0feea8ed073-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-fcsws\" (UID: \"9a07ebbc-7250-4238-9bbe-e0feea8ed073\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws"
Apr 16 15:14:40.736059 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:40.736035 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9a07ebbc-7250-4238-9bbe-e0feea8ed073-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-fcsws\" (UID: \"9a07ebbc-7250-4238-9bbe-e0feea8ed073\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws"
Apr 16 15:14:40.860044 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:40.859937 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws"
Apr 16 15:14:40.980043 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:40.980012 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws"]
Apr 16 15:14:40.982847 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:40.982812 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a07ebbc_7250_4238_9bbe_e0feea8ed073.slice/crio-525b651231a70357821657d6c54572c8eab04ee72ba8c69f03f85c0038cad8a6 WatchSource:0}: Error finding container 525b651231a70357821657d6c54572c8eab04ee72ba8c69f03f85c0038cad8a6: Status 404 returned error can't find the container with id 525b651231a70357821657d6c54572c8eab04ee72ba8c69f03f85c0038cad8a6
Apr 16 15:14:41.023226 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:41.023193 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-h9c44" podUID="414282ac-76dc-4992-819a-bdf05043c09d"
Apr 16 15:14:41.432911 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:41.432871 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws" event={"ID":"9a07ebbc-7250-4238-9bbe-e0feea8ed073","Type":"ContainerStarted","Data":"525b651231a70357821657d6c54572c8eab04ee72ba8c69f03f85c0038cad8a6"}
Apr 16 15:14:42.437640 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:42.437603 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws" event={"ID":"9a07ebbc-7250-4238-9bbe-e0feea8ed073","Type":"ContainerStarted","Data":"a5b241f0d3fb99899d6746fe454ccb3f7b1fe5dec96eb7eee21ff8b99452a167"}
Apr 16 15:14:42.454106 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:42.454061 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws" podStartSLOduration=1.52462251 podStartE2EDuration="2.454046816s" podCreationTimestamp="2026-04-16 15:14:40 +0000 UTC" firstStartedPulling="2026-04-16 15:14:40.984632157 +0000 UTC m=+159.570835944" lastFinishedPulling="2026-04-16 15:14:41.914056462 +0000 UTC m=+160.500260250" observedRunningTime="2026-04-16 15:14:42.452931612 +0000 UTC m=+161.039135468" watchObservedRunningTime="2026-04-16 15:14:42.454046816 +0000 UTC m=+161.040250625"
Apr 16 15:14:43.440469 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.440433 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws"
Apr 16 15:14:43.445067 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.445044 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fcsws"
Apr 16 15:14:43.660887 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.660853 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-2c5vl"]
Apr 16 15:14:43.697739 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.697665 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-2c5vl"]
Apr 16 15:14:43.697866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.697791 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl"
Apr 16 15:14:43.701516 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.701496 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 15:14:43.701760 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.701746 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 15:14:43.701886 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.701866 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 15:14:43.701987 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.701890 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 15:14:43.701987 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.701910 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 15:14:43.702151 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.702107 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gdqt5\""
Apr 16 15:14:43.754101 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.754061 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f973310e-d7cd-4927-b934-0e10d5013bd8-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl"
Apr 16 15:14:43.754248 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.754153 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzwlv\" (UniqueName: \"kubernetes.io/projected/f973310e-d7cd-4927-b934-0e10d5013bd8-kube-api-access-tzwlv\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl"
Apr 16 15:14:43.754248 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.754181 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f973310e-d7cd-4927-b934-0e10d5013bd8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl"
Apr 16 15:14:43.754322 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.754246 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f973310e-d7cd-4927-b934-0e10d5013bd8-metrics-client-ca\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl"
Apr 16 15:14:43.855478 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.855435 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f973310e-d7cd-4927-b934-0e10d5013bd8-metrics-client-ca\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl"
Apr 16
15:14:43.855627 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.855493 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f973310e-d7cd-4927-b934-0e10d5013bd8-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" Apr 16 15:14:43.855627 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.855545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzwlv\" (UniqueName: \"kubernetes.io/projected/f973310e-d7cd-4927-b934-0e10d5013bd8-kube-api-access-tzwlv\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" Apr 16 15:14:43.855627 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.855570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f973310e-d7cd-4927-b934-0e10d5013bd8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" Apr 16 15:14:43.855818 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:43.855683 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 15:14:43.855818 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:43.855757 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f973310e-d7cd-4927-b934-0e10d5013bd8-prometheus-operator-tls podName:f973310e-d7cd-4927-b934-0e10d5013bd8 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:44.35573761 +0000 UTC m=+162.941941416 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f973310e-d7cd-4927-b934-0e10d5013bd8-prometheus-operator-tls") pod "prometheus-operator-78f957474d-2c5vl" (UID: "f973310e-d7cd-4927-b934-0e10d5013bd8") : secret "prometheus-operator-tls" not found Apr 16 15:14:43.856195 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.856173 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f973310e-d7cd-4927-b934-0e10d5013bd8-metrics-client-ca\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" Apr 16 15:14:43.858069 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.858046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f973310e-d7cd-4927-b934-0e10d5013bd8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" Apr 16 15:14:43.864673 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:43.864639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzwlv\" (UniqueName: \"kubernetes.io/projected/f973310e-d7cd-4927-b934-0e10d5013bd8-kube-api-access-tzwlv\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" Apr 16 15:14:44.259141 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:44.259106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " 
pod="openshift-ingress-canary/ingress-canary-kw9vq" Apr 16 15:14:44.259351 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:44.259164 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh" Apr 16 15:14:44.261614 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:44.261595 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d53317c-dbd7-4c13-ba73-b51e72998223-cert\") pod \"ingress-canary-kw9vq\" (UID: \"8d53317c-dbd7-4c13-ba73-b51e72998223\") " pod="openshift-ingress-canary/ingress-canary-kw9vq" Apr 16 15:14:44.273339 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:44.273313 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c03038e-c257-4668-a7fa-62c438af26c8-metrics-tls\") pod \"dns-default-mgxdh\" (UID: \"5c03038e-c257-4668-a7fa-62c438af26c8\") " pod="openshift-dns/dns-default-mgxdh" Apr 16 15:14:44.359749 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:44.359710 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f973310e-d7cd-4927-b934-0e10d5013bd8-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: \"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" Apr 16 15:14:44.362147 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:44.362127 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f973310e-d7cd-4927-b934-0e10d5013bd8-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-2c5vl\" (UID: 
\"f973310e-d7cd-4927-b934-0e10d5013bd8\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" Apr 16 15:14:44.529034 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:44.528955 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xrkdr\"" Apr 16 15:14:44.536233 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:44.536210 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mgxdh" Apr 16 15:14:44.608372 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:44.608330 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" Apr 16 15:14:44.674943 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:44.674911 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mgxdh"] Apr 16 15:14:44.677945 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:44.677918 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c03038e_c257_4668_a7fa_62c438af26c8.slice/crio-f86a8274d25094328e2a19f20e1082dd76a65bd60f8c40911675407bc02aaff3 WatchSource:0}: Error finding container f86a8274d25094328e2a19f20e1082dd76a65bd60f8c40911675407bc02aaff3: Status 404 returned error can't find the container with id f86a8274d25094328e2a19f20e1082dd76a65bd60f8c40911675407bc02aaff3 Apr 16 15:14:44.733321 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:44.733286 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-2c5vl"] Apr 16 15:14:44.735975 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:44.735950 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf973310e_d7cd_4927_b934_0e10d5013bd8.slice/crio-4d6f516db09a31bf6133010ced65b5ed048ddd1c222c67a6065f6b48b15738b1 
WatchSource:0}: Error finding container 4d6f516db09a31bf6133010ced65b5ed048ddd1c222c67a6065f6b48b15738b1: Status 404 returned error can't find the container with id 4d6f516db09a31bf6133010ced65b5ed048ddd1c222c67a6065f6b48b15738b1 Apr 16 15:14:45.447308 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:45.447266 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" event={"ID":"f973310e-d7cd-4927-b934-0e10d5013bd8","Type":"ContainerStarted","Data":"4d6f516db09a31bf6133010ced65b5ed048ddd1c222c67a6065f6b48b15738b1"} Apr 16 15:14:45.449611 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:45.449578 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mgxdh" event={"ID":"5c03038e-c257-4668-a7fa-62c438af26c8","Type":"ContainerStarted","Data":"f86a8274d25094328e2a19f20e1082dd76a65bd60f8c40911675407bc02aaff3"} Apr 16 15:14:46.453692 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:46.453622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" event={"ID":"f973310e-d7cd-4927-b934-0e10d5013bd8","Type":"ContainerStarted","Data":"e29ec4046bfb422ebe34aa844e505fdaf7e965005f9ee6622f643ee59747cc9e"} Apr 16 15:14:47.458572 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:47.458542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mgxdh" event={"ID":"5c03038e-c257-4668-a7fa-62c438af26c8","Type":"ContainerStarted","Data":"d8afae64bc03aabb86a5484fadfbb9ee1797bfa1d6815e8cd179a52bfd6dc0ac"} Apr 16 15:14:47.458572 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:47.458577 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mgxdh" event={"ID":"5c03038e-c257-4668-a7fa-62c438af26c8","Type":"ContainerStarted","Data":"9508d6d28283adc8fef29813e75975f0e5db0e2cfcba7eed1f26eed2312fea6f"} Apr 16 15:14:47.459078 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:47.458634 
2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mgxdh" Apr 16 15:14:47.460218 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:47.460200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" event={"ID":"f973310e-d7cd-4927-b934-0e10d5013bd8","Type":"ContainerStarted","Data":"65b7306b53bd01a6c601702e9604ca111be155dc559eb132ef5eba694e3ac769"} Apr 16 15:14:47.475783 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:47.475745 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mgxdh" podStartSLOduration=129.798916296 podStartE2EDuration="2m11.475732582s" podCreationTimestamp="2026-04-16 15:12:36 +0000 UTC" firstStartedPulling="2026-04-16 15:14:44.680237024 +0000 UTC m=+163.266440818" lastFinishedPulling="2026-04-16 15:14:46.357053317 +0000 UTC m=+164.943257104" observedRunningTime="2026-04-16 15:14:47.475266312 +0000 UTC m=+166.061470120" watchObservedRunningTime="2026-04-16 15:14:47.475732582 +0000 UTC m=+166.061936391" Apr 16 15:14:48.014528 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:48.014491 2573 scope.go:117] "RemoveContainer" containerID="fd07b0070f6eb2ca78096c2016803510adff02d82e27c460dafdcb06d981e768" Apr 16 15:14:48.017946 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:14:48.017900 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-xlsw6_openshift-console-operator(22e6f93e-aa56-4c51-85da-4f105af8d76d)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" podUID="22e6f93e-aa56-4c51-85da-4f105af8d76d" Apr 16 15:14:49.044724 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.044673 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-operator-78f957474d-2c5vl" podStartSLOduration=4.42381566 podStartE2EDuration="6.044638724s" podCreationTimestamp="2026-04-16 15:14:43 +0000 UTC" firstStartedPulling="2026-04-16 15:14:44.737782238 +0000 UTC m=+163.323986026" lastFinishedPulling="2026-04-16 15:14:46.358605301 +0000 UTC m=+164.944809090" observedRunningTime="2026-04-16 15:14:47.491752818 +0000 UTC m=+166.077956627" watchObservedRunningTime="2026-04-16 15:14:49.044638724 +0000 UTC m=+167.630842532" Apr 16 15:14:49.045111 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.044817 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qz4mc"] Apr 16 15:14:49.048180 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.048160 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.050845 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.050818 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 15:14:49.051094 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.051078 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8z8zc\"" Apr 16 15:14:49.051184 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.051128 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 15:14:49.052269 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.052252 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 15:14:49.101357 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.101322 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-accelerators-collector-config\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.101515 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.101365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a114c53c-19c2-4d2c-8001-45b045bfca65-root\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.101515 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.101384 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-tls\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.101515 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.101404 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a114c53c-19c2-4d2c-8001-45b045bfca65-metrics-client-ca\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.101515 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.101498 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " 
pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.101742 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.101541 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-textfile\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.101742 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.101570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-wtmp\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.101742 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.101611 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd2cx\" (UniqueName: \"kubernetes.io/projected/a114c53c-19c2-4d2c-8001-45b045bfca65-kube-api-access-dd2cx\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.101742 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.101698 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a114c53c-19c2-4d2c-8001-45b045bfca65-sys\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202113 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202079 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202270 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-textfile\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202270 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-wtmp\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202374 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd2cx\" (UniqueName: \"kubernetes.io/projected/a114c53c-19c2-4d2c-8001-45b045bfca65-kube-api-access-dd2cx\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202374 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a114c53c-19c2-4d2c-8001-45b045bfca65-sys\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202374 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202345 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-accelerators-collector-config\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202518 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a114c53c-19c2-4d2c-8001-45b045bfca65-root\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202518 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202417 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-tls\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202518 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202426 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-textfile\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202518 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202430 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a114c53c-19c2-4d2c-8001-45b045bfca65-sys\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202518 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202449 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a114c53c-19c2-4d2c-8001-45b045bfca65-metrics-client-ca\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202518 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-wtmp\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202518 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a114c53c-19c2-4d2c-8001-45b045bfca65-root\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.202965 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202945 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-accelerators-collector-config\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.203196 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.202947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a114c53c-19c2-4d2c-8001-45b045bfca65-metrics-client-ca\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.204517 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.204495 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.204748 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.204729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a114c53c-19c2-4d2c-8001-45b045bfca65-node-exporter-tls\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.214550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.214530 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd2cx\" (UniqueName: \"kubernetes.io/projected/a114c53c-19c2-4d2c-8001-45b045bfca65-kube-api-access-dd2cx\") pod \"node-exporter-qz4mc\" (UID: \"a114c53c-19c2-4d2c-8001-45b045bfca65\") " pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.357221 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.357142 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qz4mc" Apr 16 15:14:49.365095 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:49.365066 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda114c53c_19c2_4d2c_8001_45b045bfca65.slice/crio-5c2c6c8ed8827cb9d9d66792698f8f2797f893fb1e4e569fa3057584c1919681 WatchSource:0}: Error finding container 5c2c6c8ed8827cb9d9d66792698f8f2797f893fb1e4e569fa3057584c1919681: Status 404 returned error can't find the container with id 5c2c6c8ed8827cb9d9d66792698f8f2797f893fb1e4e569fa3057584c1919681 Apr 16 15:14:49.467453 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:49.467408 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qz4mc" event={"ID":"a114c53c-19c2-4d2c-8001-45b045bfca65","Type":"ContainerStarted","Data":"5c2c6c8ed8827cb9d9d66792698f8f2797f893fb1e4e569fa3057584c1919681"} Apr 16 15:14:50.014309 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.014270 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kw9vq" Apr 16 15:14:50.017368 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.017347 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rc5hb\"" Apr 16 15:14:50.025086 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.025063 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kw9vq" Apr 16 15:14:50.089939 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.089908 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 15:14:50.094237 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.094216 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.097903 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.097740 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 15:14:50.097903 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.097757 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 15:14:50.098055 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.097930 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 15:14:50.098239 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.098221 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 15:14:50.099450 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.099232 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 15:14:50.099450 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.099281 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 15:14:50.099450 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.099329 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 15:14:50.099715 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.099463 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-kgqmd\"" Apr 16 15:14:50.099715 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.099571 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 15:14:50.099715 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.099598 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 15:14:50.122885 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.122677 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 15:14:50.160902 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.160859 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kw9vq"] Apr 16 15:14:50.164727 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:50.164690 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d53317c_dbd7_4c13_ba73_b51e72998223.slice/crio-f2503965d162b7d8edcdeaef52d2455f040a542584af33f8b08d24b32d46f290 WatchSource:0}: Error finding container f2503965d162b7d8edcdeaef52d2455f040a542584af33f8b08d24b32d46f290: Status 404 returned error can't find the container with id f2503965d162b7d8edcdeaef52d2455f040a542584af33f8b08d24b32d46f290 Apr 16 15:14:50.210782 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.210756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.210909 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.210790 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-tls-assets\") pod \"alertmanager-main-0\" (UID: 
\"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.210909 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.210805 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.210909 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.210827 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.211022 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.210920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.211022 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.210949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-config-out\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.211022 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.210974 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-web-config\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.211125 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.211022 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-config-volume\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.211125 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.211053 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.211125 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.211074 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.211125 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.211091 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.211125 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.211105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bhpb\" (UniqueName: \"kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-kube-api-access-7bhpb\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.211125 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.211122 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312200 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312200 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-config-out\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312200 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312176 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-web-config\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312200 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-config-volume\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312506 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312506 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312506 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312506 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312454 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7bhpb\" (UniqueName: \"kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-kube-api-access-7bhpb\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312759 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312759 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312759 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312759 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.312759 
ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312747 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.313018 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.312805 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.313928 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.313296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.314089 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.314057 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.315597 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.315538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-config-out\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.315597 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.315546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-config-volume\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.315794 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.315621 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.315794 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.315730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.315943 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.315910 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-web-config\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.316887 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.316866 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.316968 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.316902 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.317929 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.317909 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.317967 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.317934 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.321735 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.321706 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bhpb\" (UniqueName: \"kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-kube-api-access-7bhpb\") pod \"alertmanager-main-0\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.405366 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.405330 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:14:50.471821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.471789 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kw9vq" event={"ID":"8d53317c-dbd7-4c13-ba73-b51e72998223","Type":"ContainerStarted","Data":"f2503965d162b7d8edcdeaef52d2455f040a542584af33f8b08d24b32d46f290"} Apr 16 15:14:50.473112 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.473090 2573 generic.go:358] "Generic (PLEG): container finished" podID="a114c53c-19c2-4d2c-8001-45b045bfca65" containerID="30499fa6af4e9d83968f1c18e1cbc5d3adb201746ee14101e6a9d958b1819385" exitCode=0 Apr 16 15:14:50.473219 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.473152 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qz4mc" event={"ID":"a114c53c-19c2-4d2c-8001-45b045bfca65","Type":"ContainerDied","Data":"30499fa6af4e9d83968f1c18e1cbc5d3adb201746ee14101e6a9d958b1819385"} Apr 16 15:14:50.532726 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:50.532705 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 15:14:50.535370 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:50.535332 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3411df5_97c2_4726_80d7_73987b6690e9.slice/crio-e485e4a9d0e092374197cd5be464b557fb4c648fc1cea529ba137ab595b914f6 WatchSource:0}: Error finding container e485e4a9d0e092374197cd5be464b557fb4c648fc1cea529ba137ab595b914f6: Status 404 returned error can't find the container with id e485e4a9d0e092374197cd5be464b557fb4c648fc1cea529ba137ab595b914f6 Apr 16 15:14:51.478398 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:51.478352 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qz4mc" 
event={"ID":"a114c53c-19c2-4d2c-8001-45b045bfca65","Type":"ContainerStarted","Data":"43d276ddae0ca8a234add0357bcefc4fed99d7362def812fe43f75f65ccbd659"} Apr 16 15:14:51.478398 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:51.478398 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qz4mc" event={"ID":"a114c53c-19c2-4d2c-8001-45b045bfca65","Type":"ContainerStarted","Data":"20f6f1c024c66de9ab2c9b9aad3a451ddba5e8d09ed5ca6e459cc9842f8f404a"} Apr 16 15:14:51.479614 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:51.479583 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerStarted","Data":"e485e4a9d0e092374197cd5be464b557fb4c648fc1cea529ba137ab595b914f6"} Apr 16 15:14:52.018530 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.015172 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qz4mc" podStartSLOduration=2.3613309080000002 podStartE2EDuration="3.015151855s" podCreationTimestamp="2026-04-16 15:14:49 +0000 UTC" firstStartedPulling="2026-04-16 15:14:49.366842181 +0000 UTC m=+167.953045967" lastFinishedPulling="2026-04-16 15:14:50.020663126 +0000 UTC m=+168.606866914" observedRunningTime="2026-04-16 15:14:51.513510025 +0000 UTC m=+170.099713837" watchObservedRunningTime="2026-04-16 15:14:52.015151855 +0000 UTC m=+170.601355665" Apr 16 15:14:52.020726 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.020705 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql"] Apr 16 15:14:52.026830 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.025847 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.031338 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.031081 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 15:14:52.031338 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.031114 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 15:14:52.031338 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.031138 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 15:14:52.031338 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.031166 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 15:14:52.031338 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.031138 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 15:14:52.031338 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.031215 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-8vak27v8i9thr\"" Apr 16 15:14:52.031720 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.031482 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-mp467\"" Apr 16 15:14:52.033635 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.033607 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql"] Apr 16 15:14:52.129339 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.129302 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-grpc-tls\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.129492 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.129350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.129492 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.129380 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-tls\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.129492 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.129402 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd886ad9-e063-4c34-8522-7e261bdfe21c-metrics-client-ca\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.129492 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.129452 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrklx\" (UniqueName: 
\"kubernetes.io/projected/fd886ad9-e063-4c34-8522-7e261bdfe21c-kube-api-access-xrklx\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.129492 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.129476 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.129492 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.129492 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.129774 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.129522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.230801 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.230774 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-grpc-tls\") pod 
\"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.230926 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.230807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.230926 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.230826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-tls\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.230926 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.230844 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd886ad9-e063-4c34-8522-7e261bdfe21c-metrics-client-ca\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.231121 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.231096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrklx\" (UniqueName: \"kubernetes.io/projected/fd886ad9-e063-4c34-8522-7e261bdfe21c-kube-api-access-xrklx\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.231155 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:14:52.231142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.231202 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.231168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.231249 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.231213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.231485 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.231460 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd886ad9-e063-4c34-8522-7e261bdfe21c-metrics-client-ca\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.233590 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.233555 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.234035 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.234019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-tls\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.234150 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.234126 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.234229 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.234205 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-grpc-tls\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.234314 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.234209 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy-web\") pod 
\"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.234534 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.234520 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd886ad9-e063-4c34-8522-7e261bdfe21c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.240968 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.240943 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrklx\" (UniqueName: \"kubernetes.io/projected/fd886ad9-e063-4c34-8522-7e261bdfe21c-kube-api-access-xrklx\") pod \"thanos-querier-695cc4d4c4-tn6ql\" (UID: \"fd886ad9-e063-4c34-8522-7e261bdfe21c\") " pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.338447 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.338408 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:52.467422 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.467342 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql"] Apr 16 15:14:52.470581 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:52.470557 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd886ad9_e063_4c34_8522_7e261bdfe21c.slice/crio-890b2555b9ed63c64bb2e7445eea08a394a7bdc51399d5e19c83d89c9b95f9e5 WatchSource:0}: Error finding container 890b2555b9ed63c64bb2e7445eea08a394a7bdc51399d5e19c83d89c9b95f9e5: Status 404 returned error can't find the container with id 890b2555b9ed63c64bb2e7445eea08a394a7bdc51399d5e19c83d89c9b95f9e5 Apr 16 15:14:52.482802 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.482775 2573 generic.go:358] "Generic (PLEG): container finished" podID="c3411df5-97c2-4726-80d7-73987b6690e9" containerID="cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498" exitCode=0 Apr 16 15:14:52.483124 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.482839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerDied","Data":"cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498"} Apr 16 15:14:52.484032 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.484010 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" event={"ID":"fd886ad9-e063-4c34-8522-7e261bdfe21c","Type":"ContainerStarted","Data":"890b2555b9ed63c64bb2e7445eea08a394a7bdc51399d5e19c83d89c9b95f9e5"} Apr 16 15:14:52.485284 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.485261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kw9vq" 
event={"ID":"8d53317c-dbd7-4c13-ba73-b51e72998223","Type":"ContainerStarted","Data":"4e01e19df865c524629970d92b81341726c291f81e5e6cc80dda09aefee89499"} Apr 16 15:14:52.531721 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:52.531679 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kw9vq" podStartSLOduration=134.730511256 podStartE2EDuration="2m16.531639603s" podCreationTimestamp="2026-04-16 15:12:36 +0000 UTC" firstStartedPulling="2026-04-16 15:14:50.166591592 +0000 UTC m=+168.752795382" lastFinishedPulling="2026-04-16 15:14:51.967719942 +0000 UTC m=+170.553923729" observedRunningTime="2026-04-16 15:14:52.530812415 +0000 UTC m=+171.117016225" watchObservedRunningTime="2026-04-16 15:14:52.531639603 +0000 UTC m=+171.117843413" Apr 16 15:14:53.481304 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.481246 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5b68b96f84-tmrm8"] Apr 16 15:14:53.484909 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.484889 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.488513 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.488488 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 15:14:53.490104 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.490083 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 15:14:53.491116 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.491096 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dorlk7i7l92oh\"" Apr 16 15:14:53.491116 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.491108 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 15:14:53.491282 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.491121 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-66gg2\"" Apr 16 15:14:53.491839 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.491815 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 15:14:53.503466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.503431 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5b68b96f84-tmrm8"] Apr 16 15:14:53.542695 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.542643 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-secret-metrics-server-tls\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " 
pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.542872 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.542712 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-audit-log\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.542872 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.542750 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm75j\" (UniqueName: \"kubernetes.io/projected/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-kube-api-access-hm75j\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.542872 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.542841 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-client-ca-bundle\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.543038 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.542890 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-metrics-server-audit-profiles\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.543038 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.542919 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.543038 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.542953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-secret-metrics-server-client-certs\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.643584 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.643547 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-secret-metrics-server-tls\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.643782 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.643596 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-audit-log\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.643782 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.643772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hm75j\" (UniqueName: 
\"kubernetes.io/projected/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-kube-api-access-hm75j\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.643913 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.643846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-client-ca-bundle\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.643913 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.643905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-metrics-server-audit-profiles\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.644010 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.643935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.644010 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.643965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-secret-metrics-server-client-certs\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " 
pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.644113 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.644051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-audit-log\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.645321 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.645294 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.645476 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.645414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-metrics-server-audit-profiles\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.647265 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.647226 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-secret-metrics-server-client-certs\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.647433 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.647409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-secret-metrics-server-tls\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.648124 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.648105 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-client-ca-bundle\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.653352 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.653329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm75j\" (UniqueName: \"kubernetes.io/projected/fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec-kube-api-access-hm75j\") pod \"metrics-server-5b68b96f84-tmrm8\" (UID: \"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec\") " pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:53.796576 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:53.796490 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" Apr 16 15:14:54.801302 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:54.801273 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5b68b96f84-tmrm8"] Apr 16 15:14:54.802825 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:14:54.802796 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe009bf2_e9e3_43d9_a8e5_fa926d26d0ec.slice/crio-fe95aaf9b9bb3e3490de20f4a64642b5f8e44c39f749e23d0334088ed20fcb56 WatchSource:0}: Error finding container fe95aaf9b9bb3e3490de20f4a64642b5f8e44c39f749e23d0334088ed20fcb56: Status 404 returned error can't find the container with id fe95aaf9b9bb3e3490de20f4a64642b5f8e44c39f749e23d0334088ed20fcb56 Apr 16 15:14:55.013781 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:55.013754 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:14:55.502777 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:55.502726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" event={"ID":"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec","Type":"ContainerStarted","Data":"fe95aaf9b9bb3e3490de20f4a64642b5f8e44c39f749e23d0334088ed20fcb56"} Apr 16 15:14:55.509385 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:55.509269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerStarted","Data":"6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8"} Apr 16 15:14:55.509385 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:55.509307 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerStarted","Data":"609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e"} Apr 16 15:14:55.509385 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:55.509320 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerStarted","Data":"6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835"} Apr 16 15:14:55.509385 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:55.509331 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerStarted","Data":"09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56"} Apr 16 15:14:55.509385 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:55.509344 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerStarted","Data":"d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4"} Apr 16 15:14:55.512973 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:55.512885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" event={"ID":"fd886ad9-e063-4c34-8522-7e261bdfe21c","Type":"ContainerStarted","Data":"b27cb23a4863392a7f5de0d660ef68836b5e68a65540dc3c37162d9f815d614e"} Apr 16 15:14:55.512973 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:55.512917 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" event={"ID":"fd886ad9-e063-4c34-8522-7e261bdfe21c","Type":"ContainerStarted","Data":"d59a800f406ed29ddaf3633a65dfe554f8756413aee13fd1d4014b149017f2b3"} Apr 16 15:14:55.512973 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:55.512931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" event={"ID":"fd886ad9-e063-4c34-8522-7e261bdfe21c","Type":"ContainerStarted","Data":"df9ceb7136c01d34facfe7201e726ed597a58395dcd7bba4ac5dc13bb0992e4b"} Apr 16 15:14:56.517024 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:56.516928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" event={"ID":"fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec","Type":"ContainerStarted","Data":"1d32ba201719c06fa6d48171f50c96d5ea4030c6184fc956ba9c1dd95c02f0da"} Apr 16 15:14:56.520815 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:56.520787 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerStarted","Data":"c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf"} Apr 16 15:14:56.524049 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:56.524028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" event={"ID":"fd886ad9-e063-4c34-8522-7e261bdfe21c","Type":"ContainerStarted","Data":"4bd7f9a54be27a5365f300b80fef01df228e4cd0b4a2033e870443db130409e6"} Apr 16 15:14:56.524146 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:56.524054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" event={"ID":"fd886ad9-e063-4c34-8522-7e261bdfe21c","Type":"ContainerStarted","Data":"58c17e7bcc4251f3dd3cc7926a94b665ad3434928f2dc849032271f10f7ca02c"} Apr 16 15:14:56.524146 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:56.524063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" event={"ID":"fd886ad9-e063-4c34-8522-7e261bdfe21c","Type":"ContainerStarted","Data":"d86576a7b8a7e25ae580dd5a5d79078a3af9ca20ff86466e41b9ca410ad77a05"} Apr 16 15:14:56.524219 ip-10-0-136-151 kubenswrapper[2573]: 
I0416 15:14:56.524164 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:14:56.544693 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:56.544635 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8" podStartSLOduration=2.117348294 podStartE2EDuration="3.544624164s" podCreationTimestamp="2026-04-16 15:14:53 +0000 UTC" firstStartedPulling="2026-04-16 15:14:54.804774967 +0000 UTC m=+173.390978754" lastFinishedPulling="2026-04-16 15:14:56.232050833 +0000 UTC m=+174.818254624" observedRunningTime="2026-04-16 15:14:56.543043976 +0000 UTC m=+175.129247801" watchObservedRunningTime="2026-04-16 15:14:56.544624164 +0000 UTC m=+175.130827973" Apr 16 15:14:56.582321 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:56.582272 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=0.929026468 podStartE2EDuration="6.582253625s" podCreationTimestamp="2026-04-16 15:14:50 +0000 UTC" firstStartedPulling="2026-04-16 15:14:50.537296622 +0000 UTC m=+169.123500409" lastFinishedPulling="2026-04-16 15:14:56.190523775 +0000 UTC m=+174.776727566" observedRunningTime="2026-04-16 15:14:56.579399 +0000 UTC m=+175.165602810" watchObservedRunningTime="2026-04-16 15:14:56.582253625 +0000 UTC m=+175.168457434" Apr 16 15:14:56.613836 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:56.613777 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" podStartSLOduration=1.894301859 podStartE2EDuration="5.613758032s" podCreationTimestamp="2026-04-16 15:14:51 +0000 UTC" firstStartedPulling="2026-04-16 15:14:52.472531415 +0000 UTC m=+171.058735203" lastFinishedPulling="2026-04-16 15:14:56.191987575 +0000 UTC m=+174.778191376" observedRunningTime="2026-04-16 15:14:56.612305535 
+0000 UTC m=+175.198509355" watchObservedRunningTime="2026-04-16 15:14:56.613758032 +0000 UTC m=+175.199961844" Apr 16 15:14:57.465752 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:14:57.465722 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mgxdh" Apr 16 15:15:02.017070 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.017029 2573 scope.go:117] "RemoveContainer" containerID="fd07b0070f6eb2ca78096c2016803510adff02d82e27c460dafdcb06d981e768" Apr 16 15:15:02.534572 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.534545 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-695cc4d4c4-tn6ql" Apr 16 15:15:02.545221 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.545193 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log" Apr 16 15:15:02.545400 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.545322 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" event={"ID":"22e6f93e-aa56-4c51-85da-4f105af8d76d","Type":"ContainerStarted","Data":"d08b81570c0e52b46e17cdc465290dff2523b09908cdf036fdcb417e4ab55fa7"} Apr 16 15:15:02.545691 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.545640 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:15:02.552839 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.552812 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" Apr 16 15:15:02.599917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.599869 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console-operator/console-operator-d87b8d5fc-xlsw6" podStartSLOduration=53.689906985 podStartE2EDuration="56.599851922s" podCreationTimestamp="2026-04-16 15:14:06 +0000 UTC" firstStartedPulling="2026-04-16 15:14:06.977153021 +0000 UTC m=+125.563356811" lastFinishedPulling="2026-04-16 15:14:09.887097943 +0000 UTC m=+128.473301748" observedRunningTime="2026-04-16 15:15:02.599495049 +0000 UTC m=+181.185698859" watchObservedRunningTime="2026-04-16 15:15:02.599851922 +0000 UTC m=+181.186055733" Apr 16 15:15:02.710441 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.710405 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-pggd8"] Apr 16 15:15:02.713841 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.713824 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-pggd8" Apr 16 15:15:02.716860 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.716838 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-nqtjg\"" Apr 16 15:15:02.716860 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.716854 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 15:15:02.716986 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.716860 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 15:15:02.722036 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.722013 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-pggd8"] Apr 16 15:15:02.826876 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.826773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbxwg\" (UniqueName: 
\"kubernetes.io/projected/3ae8c896-dcf3-4414-83ac-16e14e874dce-kube-api-access-sbxwg\") pod \"downloads-586b57c7b4-pggd8\" (UID: \"3ae8c896-dcf3-4414-83ac-16e14e874dce\") " pod="openshift-console/downloads-586b57c7b4-pggd8" Apr 16 15:15:02.927219 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.927174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbxwg\" (UniqueName: \"kubernetes.io/projected/3ae8c896-dcf3-4414-83ac-16e14e874dce-kube-api-access-sbxwg\") pod \"downloads-586b57c7b4-pggd8\" (UID: \"3ae8c896-dcf3-4414-83ac-16e14e874dce\") " pod="openshift-console/downloads-586b57c7b4-pggd8" Apr 16 15:15:02.935746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:02.935720 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbxwg\" (UniqueName: \"kubernetes.io/projected/3ae8c896-dcf3-4414-83ac-16e14e874dce-kube-api-access-sbxwg\") pod \"downloads-586b57c7b4-pggd8\" (UID: \"3ae8c896-dcf3-4414-83ac-16e14e874dce\") " pod="openshift-console/downloads-586b57c7b4-pggd8" Apr 16 15:15:03.023311 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:03.023273 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-pggd8" Apr 16 15:15:03.160309 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:03.160150 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-pggd8"] Apr 16 15:15:03.162385 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:15:03.162356 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae8c896_dcf3_4414_83ac_16e14e874dce.slice/crio-c026b06cb3a891e9e977411767459cd566ca785b09099ec80fe770ed8de95fda WatchSource:0}: Error finding container c026b06cb3a891e9e977411767459cd566ca785b09099ec80fe770ed8de95fda: Status 404 returned error can't find the container with id c026b06cb3a891e9e977411767459cd566ca785b09099ec80fe770ed8de95fda Apr 16 15:15:03.549439 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:03.549400 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-pggd8" event={"ID":"3ae8c896-dcf3-4414-83ac-16e14e874dce","Type":"ContainerStarted","Data":"c026b06cb3a891e9e977411767459cd566ca785b09099ec80fe770ed8de95fda"} Apr 16 15:15:07.631361 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.631322 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-566c76b5cc-w784h"] Apr 16 15:15:07.634596 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.634573 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-566c76b5cc-w784h" Apr 16 15:15:07.637528 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.637508 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 15:15:07.637660 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.637549 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 15:15:07.638971 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.638767 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 15:15:07.638971 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.638780 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 15:15:07.638971 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.638861 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gpn9b\"" Apr 16 15:15:07.639216 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.639043 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 15:15:07.643805 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.643785 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-566c76b5cc-w784h"] Apr 16 15:15:07.770217 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.770181 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-oauth-config\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h" Apr 16 15:15:07.770363 ip-10-0-136-151 kubenswrapper[2573]: 
I0416 15:15:07.770231 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-service-ca\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.770363 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.770304 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-serving-cert\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.770483 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.770382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-config\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.770532 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.770476 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-oauth-serving-cert\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.770587 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.770544 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6x4r\" (UniqueName: \"kubernetes.io/projected/1ae9dab3-c42d-47b1-95c7-32208caa61f6-kube-api-access-h6x4r\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.871292 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.871251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6x4r\" (UniqueName: \"kubernetes.io/projected/1ae9dab3-c42d-47b1-95c7-32208caa61f6-kube-api-access-h6x4r\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.871292 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.871298 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-oauth-config\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.871525 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.871423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-service-ca\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.871525 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.871468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-serving-cert\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.871635 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.871524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-config\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.871635 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.871576 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-oauth-serving-cert\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.872300 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.872262 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-service-ca\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.872300 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.872268 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-oauth-serving-cert\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.872521 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.872498 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-config\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.874144 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.874121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-oauth-config\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.874322 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.874299 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-serving-cert\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.880823 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.880796 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6x4r\" (UniqueName: \"kubernetes.io/projected/1ae9dab3-c42d-47b1-95c7-32208caa61f6-kube-api-access-h6x4r\") pod \"console-566c76b5cc-w784h\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") " pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:07.945938 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:07.945900 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:08.111128 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:08.111093 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-566c76b5cc-w784h"]
Apr 16 15:15:08.114806 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:15:08.114770 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ae9dab3_c42d_47b1_95c7_32208caa61f6.slice/crio-5550dc0f96938558e923fbebfd6f60ca1b6d94fe12eed3a80714e27c34752c74 WatchSource:0}: Error finding container 5550dc0f96938558e923fbebfd6f60ca1b6d94fe12eed3a80714e27c34752c74: Status 404 returned error can't find the container with id 5550dc0f96938558e923fbebfd6f60ca1b6d94fe12eed3a80714e27c34752c74
Apr 16 15:15:08.568028 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:08.567986 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-566c76b5cc-w784h" event={"ID":"1ae9dab3-c42d-47b1-95c7-32208caa61f6","Type":"ContainerStarted","Data":"5550dc0f96938558e923fbebfd6f60ca1b6d94fe12eed3a80714e27c34752c74"}
Apr 16 15:15:11.578890 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:11.578812 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-566c76b5cc-w784h" event={"ID":"1ae9dab3-c42d-47b1-95c7-32208caa61f6","Type":"ContainerStarted","Data":"7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e"}
Apr 16 15:15:11.606275 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:11.606228 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-566c76b5cc-w784h" podStartSLOduration=1.470063122 podStartE2EDuration="4.606214018s" podCreationTimestamp="2026-04-16 15:15:07 +0000 UTC" firstStartedPulling="2026-04-16 15:15:08.117116278 +0000 UTC m=+186.703320067" lastFinishedPulling="2026-04-16 15:15:11.253267177 +0000 UTC m=+189.839470963" observedRunningTime="2026-04-16 15:15:11.604215895 +0000 UTC m=+190.190419702" watchObservedRunningTime="2026-04-16 15:15:11.606214018 +0000 UTC m=+190.192417827"
Apr 16 15:15:13.796971 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.796932 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8"
Apr 16 15:15:13.797424 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.796999 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8"
Apr 16 15:15:13.820159 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.820127 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-dd64565db-45dmz"]
Apr 16 15:15:13.824814 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.824786 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:13.834232 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.834209 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 15:15:13.837426 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.837405 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dd64565db-45dmz"]
Apr 16 15:15:13.930246 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.930205 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-serving-cert\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:13.930468 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.930432 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-config\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:13.930580 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.930491 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-service-ca\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:13.930729 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.930698 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg2cr\" (UniqueName: \"kubernetes.io/projected/4a66b64f-9125-4d76-8a81-4a110f1909d6-kube-api-access-vg2cr\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:13.930840 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.930801 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-oauth-config\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:13.930925 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.930905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-trusted-ca-bundle\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:13.930987 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:13.930944 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-oauth-serving-cert\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.031609 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.031565 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vg2cr\" (UniqueName: \"kubernetes.io/projected/4a66b64f-9125-4d76-8a81-4a110f1909d6-kube-api-access-vg2cr\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.031816 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.031626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-oauth-config\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.031816 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.031667 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-trusted-ca-bundle\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.031816 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.031693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-oauth-serving-cert\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.031816 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.031736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-serving-cert\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.031816 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.031770 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-config\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.031816 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.031807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-service-ca\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.032671 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.032586 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-config\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.032671 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.032595 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-service-ca\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.032860 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.032679 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-oauth-serving-cert\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.032911 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.032873 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-trusted-ca-bundle\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.034632 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.034604 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-oauth-config\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.034917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.034891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-serving-cert\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.040065 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.040045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg2cr\" (UniqueName: \"kubernetes.io/projected/4a66b64f-9125-4d76-8a81-4a110f1909d6-kube-api-access-vg2cr\") pod \"console-dd64565db-45dmz\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") " pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:14.136282 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:14.136183 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:17.946571 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:17.946532 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:17.946571 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:17.946586 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:17.952078 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:17.952053 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:18.602981 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:18.602947 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:19.162310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:19.162274 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dd64565db-45dmz"]
Apr 16 15:15:19.165090 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:15:19.165051 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a66b64f_9125_4d76_8a81_4a110f1909d6.slice/crio-79760de1a4e137ccb5b921102e1c509a88e0cf033abae999010dbeafe30dd615 WatchSource:0}: Error finding container 79760de1a4e137ccb5b921102e1c509a88e0cf033abae999010dbeafe30dd615: Status 404 returned error can't find the container with id 79760de1a4e137ccb5b921102e1c509a88e0cf033abae999010dbeafe30dd615
Apr 16 15:15:19.604140 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:19.604101 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-pggd8" event={"ID":"3ae8c896-dcf3-4414-83ac-16e14e874dce","Type":"ContainerStarted","Data":"ffc71377a4063b17897b6fd3aa04089c656bf68cc31c7387255150abc5b55593"}
Apr 16 15:15:19.604332 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:19.604266 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-pggd8"
Apr 16 15:15:19.606169 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:19.606142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd64565db-45dmz" event={"ID":"4a66b64f-9125-4d76-8a81-4a110f1909d6","Type":"ContainerStarted","Data":"ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1"}
Apr 16 15:15:19.606280 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:19.606176 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd64565db-45dmz" event={"ID":"4a66b64f-9125-4d76-8a81-4a110f1909d6","Type":"ContainerStarted","Data":"79760de1a4e137ccb5b921102e1c509a88e0cf033abae999010dbeafe30dd615"}
Apr 16 15:15:19.620798 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:19.620767 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-pggd8"
Apr 16 15:15:19.641855 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:19.641803 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-pggd8" podStartSLOduration=1.665858405 podStartE2EDuration="17.6417899s" podCreationTimestamp="2026-04-16 15:15:02 +0000 UTC" firstStartedPulling="2026-04-16 15:15:03.164265281 +0000 UTC m=+181.750469071" lastFinishedPulling="2026-04-16 15:15:19.14019678 +0000 UTC m=+197.726400566" observedRunningTime="2026-04-16 15:15:19.623227362 +0000 UTC m=+198.209431225" watchObservedRunningTime="2026-04-16 15:15:19.6417899 +0000 UTC m=+198.227993708"
Apr 16 15:15:19.665020 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:19.664972 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-dd64565db-45dmz" podStartSLOduration=6.664953752 podStartE2EDuration="6.664953752s" podCreationTimestamp="2026-04-16 15:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:15:19.664432993 +0000 UTC m=+198.250636802" watchObservedRunningTime="2026-04-16 15:15:19.664953752 +0000 UTC m=+198.251157562"
Apr 16 15:15:24.136336 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:24.136293 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:24.136336 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:24.136343 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:24.142077 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:24.142053 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:24.626408 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:24.626379 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:15:24.676004 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:24.675971 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-566c76b5cc-w784h"]
Apr 16 15:15:31.648988 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:31.648954 2573 generic.go:358] "Generic (PLEG): container finished" podID="03b174e8-ba16-492a-9a00-8245945ade22" containerID="8e83905dc4a28c1a58a033a0efb6e0c8319ea2a08da57c9b6a651ee19fb33846" exitCode=0
Apr 16 15:15:31.649397 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:31.649004 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" event={"ID":"03b174e8-ba16-492a-9a00-8245945ade22","Type":"ContainerDied","Data":"8e83905dc4a28c1a58a033a0efb6e0c8319ea2a08da57c9b6a651ee19fb33846"}
Apr 16 15:15:31.649397 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:31.649322 2573 scope.go:117] "RemoveContainer" containerID="8e83905dc4a28c1a58a033a0efb6e0c8319ea2a08da57c9b6a651ee19fb33846"
Apr 16 15:15:32.654378 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:32.654340 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fc75r" event={"ID":"03b174e8-ba16-492a-9a00-8245945ade22","Type":"ContainerStarted","Data":"c3a3d765428aaa236273d157fb47316bead787a3367508284a909879a17eb665"}
Apr 16 15:15:33.803146 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:33.803117 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8"
Apr 16 15:15:33.806933 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:33.806913 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5b68b96f84-tmrm8"
Apr 16 15:15:40.679437 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:40.679410 2573 generic.go:358] "Generic (PLEG): container finished" podID="8f600d4b-94cc-4150-8ca8-7002385a1b56" containerID="a97176a8a7a5863107af02ee6cc699d53eb83e89ca9f844308834171b6006a16" exitCode=0
Apr 16 15:15:40.679831 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:40.679457 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" event={"ID":"8f600d4b-94cc-4150-8ca8-7002385a1b56","Type":"ContainerDied","Data":"a97176a8a7a5863107af02ee6cc699d53eb83e89ca9f844308834171b6006a16"}
Apr 16 15:15:40.679831 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:40.679738 2573 scope.go:117] "RemoveContainer" containerID="a97176a8a7a5863107af02ee6cc699d53eb83e89ca9f844308834171b6006a16"
Apr 16 15:15:41.684385 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:41.684346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qnf9j" event={"ID":"8f600d4b-94cc-4150-8ca8-7002385a1b56","Type":"ContainerStarted","Data":"7abe034c1d408f59554ce92ec32f09771e313fefdf844b59e8ca9ef4b16b9b2a"}
Apr 16 15:15:49.700560 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:49.700491 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-566c76b5cc-w784h" podUID="1ae9dab3-c42d-47b1-95c7-32208caa61f6" containerName="console" containerID="cri-o://7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e" gracePeriod=15
Apr 16 15:15:49.956896 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:49.956845 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-566c76b5cc-w784h_1ae9dab3-c42d-47b1-95c7-32208caa61f6/console/0.log"
Apr 16 15:15:49.957019 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:49.956909 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:50.058044 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.058014 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-serving-cert\") pod \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") "
Apr 16 15:15:50.058208 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.058074 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6x4r\" (UniqueName: \"kubernetes.io/projected/1ae9dab3-c42d-47b1-95c7-32208caa61f6-kube-api-access-h6x4r\") pod \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") "
Apr 16 15:15:50.058208 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.058123 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-service-ca\") pod \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") "
Apr 16 15:15:50.058208 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.058168 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-oauth-serving-cert\") pod \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") "
Apr 16 15:15:50.058355 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.058264 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-config\") pod \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") "
Apr 16 15:15:50.058355 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.058317 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-oauth-config\") pod \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\" (UID: \"1ae9dab3-c42d-47b1-95c7-32208caa61f6\") "
Apr 16 15:15:50.058609 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.058580 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1ae9dab3-c42d-47b1-95c7-32208caa61f6" (UID: "1ae9dab3-c42d-47b1-95c7-32208caa61f6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:15:50.058609 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.058592 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-service-ca" (OuterVolumeSpecName: "service-ca") pod "1ae9dab3-c42d-47b1-95c7-32208caa61f6" (UID: "1ae9dab3-c42d-47b1-95c7-32208caa61f6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:15:50.058801 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.058618 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-config" (OuterVolumeSpecName: "console-config") pod "1ae9dab3-c42d-47b1-95c7-32208caa61f6" (UID: "1ae9dab3-c42d-47b1-95c7-32208caa61f6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:15:50.060491 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.060462 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1ae9dab3-c42d-47b1-95c7-32208caa61f6" (UID: "1ae9dab3-c42d-47b1-95c7-32208caa61f6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:15:50.060582 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.060491 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1ae9dab3-c42d-47b1-95c7-32208caa61f6" (UID: "1ae9dab3-c42d-47b1-95c7-32208caa61f6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:15:50.060582 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.060533 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae9dab3-c42d-47b1-95c7-32208caa61f6-kube-api-access-h6x4r" (OuterVolumeSpecName: "kube-api-access-h6x4r") pod "1ae9dab3-c42d-47b1-95c7-32208caa61f6" (UID: "1ae9dab3-c42d-47b1-95c7-32208caa61f6"). InnerVolumeSpecName "kube-api-access-h6x4r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:15:50.159602 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.159579 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-serving-cert\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:15:50.159602 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.159602 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h6x4r\" (UniqueName: \"kubernetes.io/projected/1ae9dab3-c42d-47b1-95c7-32208caa61f6-kube-api-access-h6x4r\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:15:50.159761 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.159611 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-service-ca\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:15:50.159761 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.159621 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-oauth-serving-cert\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:15:50.159761 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.159630 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-config\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:15:50.159761 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.159638 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ae9dab3-c42d-47b1-95c7-32208caa61f6-console-oauth-config\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:15:50.720336 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.720311 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-566c76b5cc-w784h_1ae9dab3-c42d-47b1-95c7-32208caa61f6/console/0.log"
Apr 16 15:15:50.720782 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.720354 2573 generic.go:358] "Generic (PLEG): container finished" podID="1ae9dab3-c42d-47b1-95c7-32208caa61f6" containerID="7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e" exitCode=2
Apr 16 15:15:50.720782 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.720426 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-566c76b5cc-w784h"
Apr 16 15:15:50.720782 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.720452 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-566c76b5cc-w784h" event={"ID":"1ae9dab3-c42d-47b1-95c7-32208caa61f6","Type":"ContainerDied","Data":"7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e"}
Apr 16 15:15:50.720782 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.720504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-566c76b5cc-w784h" event={"ID":"1ae9dab3-c42d-47b1-95c7-32208caa61f6","Type":"ContainerDied","Data":"5550dc0f96938558e923fbebfd6f60ca1b6d94fe12eed3a80714e27c34752c74"}
Apr 16 15:15:50.720782 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.720522 2573 scope.go:117] "RemoveContainer" containerID="7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e"
Apr 16 15:15:50.729116 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.729099 2573 scope.go:117] "RemoveContainer" containerID="7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e"
Apr 16 15:15:50.729361 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:15:50.729342 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e\": container with ID starting with 7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e not found: ID does not exist" containerID="7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e"
Apr 16 15:15:50.729430 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.729376 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e"} err="failed to get container status \"7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e\": rpc error: code = NotFound desc = could not find container \"7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e\": container with ID starting with 7416f987150fd93211ed7baec00e9f187943c261c8ea4e6469f9ded712b4343e not found: ID does not exist"
Apr 16 15:15:50.756265 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.756240 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-566c76b5cc-w784h"]
Apr 16 15:15:50.763705 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:50.763682 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-566c76b5cc-w784h"]
Apr 16 15:15:52.016957 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:15:52.016929 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ae9dab3-c42d-47b1-95c7-32208caa61f6" path="/var/lib/kubelet/pods/1ae9dab3-c42d-47b1-95c7-32208caa61f6/volumes"
Apr 16 15:16:09.434915 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.434836 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 15:16:09.435338 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.435263 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="alertmanager"
containerID="cri-o://d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4" gracePeriod=120 Apr 16 15:16:09.435403 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.435359 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy-web" containerID="cri-o://6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835" gracePeriod=120 Apr 16 15:16:09.435403 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.435350 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy-metric" containerID="cri-o://6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8" gracePeriod=120 Apr 16 15:16:09.435510 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.435404 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="prom-label-proxy" containerID="cri-o://c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf" gracePeriod=120 Apr 16 15:16:09.435510 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.435381 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="config-reloader" containerID="cri-o://09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56" gracePeriod=120 Apr 16 15:16:09.435510 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.435451 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy" 
containerID="cri-o://609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e" gracePeriod=120 Apr 16 15:16:09.786364 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.786270 2573 generic.go:358] "Generic (PLEG): container finished" podID="c3411df5-97c2-4726-80d7-73987b6690e9" containerID="c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf" exitCode=0 Apr 16 15:16:09.786364 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.786294 2573 generic.go:358] "Generic (PLEG): container finished" podID="c3411df5-97c2-4726-80d7-73987b6690e9" containerID="609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e" exitCode=0 Apr 16 15:16:09.786364 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.786303 2573 generic.go:358] "Generic (PLEG): container finished" podID="c3411df5-97c2-4726-80d7-73987b6690e9" containerID="09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56" exitCode=0 Apr 16 15:16:09.786364 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.786311 2573 generic.go:358] "Generic (PLEG): container finished" podID="c3411df5-97c2-4726-80d7-73987b6690e9" containerID="d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4" exitCode=0 Apr 16 15:16:09.786364 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.786349 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerDied","Data":"c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf"} Apr 16 15:16:09.786616 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.786381 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerDied","Data":"609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e"} Apr 16 15:16:09.786616 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.786391 2573 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerDied","Data":"09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56"} Apr 16 15:16:09.786616 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:09.786401 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerDied","Data":"d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4"} Apr 16 15:16:10.677797 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.677774 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:16:10.732978 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.732951 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733129 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733000 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-main-tls\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733129 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733045 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-web-config\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733129 ip-10-0-136-151 kubenswrapper[2573]: 
I0416 15:16:10.733076 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733129 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733108 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-config-out\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733352 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733151 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-trusted-ca-bundle\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733352 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733192 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-cluster-tls-config\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733352 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733239 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-metrics-client-ca\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733352 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733265 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-7bhpb\" (UniqueName: \"kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-kube-api-access-7bhpb\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733352 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733301 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-tls-assets\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733352 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733334 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-main-db\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733720 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733392 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-config-volume\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733720 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733429 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-web\") pod \"c3411df5-97c2-4726-80d7-73987b6690e9\" (UID: \"c3411df5-97c2-4726-80d7-73987b6690e9\") " Apr 16 15:16:10.733928 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.733830 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:16:10.735464 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.735434 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:16:10.735632 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.735609 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:16:10.736257 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.736230 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:10.736672 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.736610 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:10.736887 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.736845 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:10.737270 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.737246 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:16:10.737917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.737886 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-kube-api-access-7bhpb" (OuterVolumeSpecName: "kube-api-access-7bhpb") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "kube-api-access-7bhpb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:16:10.738600 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.738520 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-config-volume" (OuterVolumeSpecName: "config-volume") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:10.738600 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.738566 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-config-out" (OuterVolumeSpecName: "config-out") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:16:10.738790 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.738601 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:10.741594 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.741568 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:10.749012 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.748986 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-web-config" (OuterVolumeSpecName: "web-config") pod "c3411df5-97c2-4726-80d7-73987b6690e9" (UID: "c3411df5-97c2-4726-80d7-73987b6690e9"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:10.791548 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.791470 2573 generic.go:358] "Generic (PLEG): container finished" podID="c3411df5-97c2-4726-80d7-73987b6690e9" containerID="6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8" exitCode=0 Apr 16 15:16:10.791548 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.791496 2573 generic.go:358] "Generic (PLEG): container finished" podID="c3411df5-97c2-4726-80d7-73987b6690e9" containerID="6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835" exitCode=0 Apr 16 15:16:10.791780 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.791551 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerDied","Data":"6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8"} Apr 16 15:16:10.791780 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.791589 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerDied","Data":"6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835"} Apr 16 15:16:10.791780 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.791592 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 15:16:10.791780 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.791600 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c3411df5-97c2-4726-80d7-73987b6690e9","Type":"ContainerDied","Data":"e485e4a9d0e092374197cd5be464b557fb4c648fc1cea529ba137ab595b914f6"} Apr 16 15:16:10.791780 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.791618 2573 scope.go:117] "RemoveContainer" containerID="c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf" Apr 16 15:16:10.799550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.799535 2573 scope.go:117] "RemoveContainer" containerID="6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8" Apr 16 15:16:10.810820 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.810801 2573 scope.go:117] "RemoveContainer" containerID="609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e" Apr 16 15:16:10.816987 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.816970 2573 scope.go:117] "RemoveContainer" containerID="6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835" Apr 16 15:16:10.820074 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.820053 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 15:16:10.823497 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.823475 2573 scope.go:117] "RemoveContainer" containerID="09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56" Apr 16 15:16:10.826302 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.826283 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 15:16:10.830407 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.830392 2573 scope.go:117] "RemoveContainer" containerID="d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4" Apr 16 15:16:10.834548 
ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834526 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.834625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834550 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-main-tls\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.834625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834562 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-web-config\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.834625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834573 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.834625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834587 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-config-out\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.834625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834595 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 
15:16:10.834625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834604 2573 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-cluster-tls-config\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.834625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834612 2573 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c3411df5-97c2-4726-80d7-73987b6690e9-metrics-client-ca\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.834625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834620 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7bhpb\" (UniqueName: \"kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-kube-api-access-7bhpb\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.834625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834628 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c3411df5-97c2-4726-80d7-73987b6690e9-tls-assets\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.834906 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834637 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c3411df5-97c2-4726-80d7-73987b6690e9-alertmanager-main-db\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.834906 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834671 2573 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-config-volume\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.834906 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.834682 2573 
reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c3411df5-97c2-4726-80d7-73987b6690e9-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:16:10.836749 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.836736 2573 scope.go:117] "RemoveContainer" containerID="cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498" Apr 16 15:16:10.843173 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.843160 2573 scope.go:117] "RemoveContainer" containerID="c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf" Apr 16 15:16:10.843434 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:16:10.843417 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf\": container with ID starting with c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf not found: ID does not exist" containerID="c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf" Apr 16 15:16:10.843471 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.843442 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf"} err="failed to get container status \"c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf\": rpc error: code = NotFound desc = could not find container \"c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf\": container with ID starting with c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf not found: ID does not exist" Apr 16 15:16:10.843471 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.843461 2573 scope.go:117] "RemoveContainer" containerID="6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8" Apr 16 15:16:10.843729 
ip-10-0-136-151 kubenswrapper[2573]: E0416 15:16:10.843711 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8\": container with ID starting with 6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8 not found: ID does not exist" containerID="6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8"
Apr 16 15:16:10.843778 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.843734 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8"} err="failed to get container status \"6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8\": rpc error: code = NotFound desc = could not find container \"6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8\": container with ID starting with 6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8 not found: ID does not exist"
Apr 16 15:16:10.843778 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.843751 2573 scope.go:117] "RemoveContainer" containerID="609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e"
Apr 16 15:16:10.843979 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:16:10.843962 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e\": container with ID starting with 609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e not found: ID does not exist" containerID="609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e"
Apr 16 15:16:10.844014 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.843985 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e"} err="failed to get container status \"609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e\": rpc error: code = NotFound desc = could not find container \"609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e\": container with ID starting with 609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e not found: ID does not exist"
Apr 16 15:16:10.844014 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.844000 2573 scope.go:117] "RemoveContainer" containerID="6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835"
Apr 16 15:16:10.844226 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:16:10.844202 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835\": container with ID starting with 6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835 not found: ID does not exist" containerID="6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835"
Apr 16 15:16:10.844313 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.844224 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835"} err="failed to get container status \"6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835\": rpc error: code = NotFound desc = could not find container \"6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835\": container with ID starting with 6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835 not found: ID does not exist"
Apr 16 15:16:10.844313 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.844239 2573 scope.go:117] "RemoveContainer" containerID="09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56"
Apr 16 15:16:10.844479 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:16:10.844463 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56\": container with ID starting with 09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56 not found: ID does not exist" containerID="09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56"
Apr 16 15:16:10.844537 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.844488 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56"} err="failed to get container status \"09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56\": rpc error: code = NotFound desc = could not find container \"09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56\": container with ID starting with 09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56 not found: ID does not exist"
Apr 16 15:16:10.844537 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.844509 2573 scope.go:117] "RemoveContainer" containerID="d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4"
Apr 16 15:16:10.844754 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:16:10.844739 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4\": container with ID starting with d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4 not found: ID does not exist" containerID="d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4"
Apr 16 15:16:10.844811 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.844759 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4"} err="failed to get container status \"d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4\": rpc error: code = NotFound desc = could not find container \"d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4\": container with ID starting with d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4 not found: ID does not exist"
Apr 16 15:16:10.844811 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.844772 2573 scope.go:117] "RemoveContainer" containerID="cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498"
Apr 16 15:16:10.844982 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:16:10.844966 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498\": container with ID starting with cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498 not found: ID does not exist" containerID="cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498"
Apr 16 15:16:10.845022 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.844987 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498"} err="failed to get container status \"cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498\": rpc error: code = NotFound desc = could not find container \"cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498\": container with ID starting with cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498 not found: ID does not exist"
Apr 16 15:16:10.845022 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.845001 2573 scope.go:117] "RemoveContainer" containerID="c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf"
Apr 16 15:16:10.845207 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.845187 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf"} err="failed to get container status \"c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf\": rpc error: code = NotFound desc = could not find container \"c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf\": container with ID starting with c93ac49c868f9aee7895c24dfce39130b033cf003c279f452b2c477b3180edaf not found: ID does not exist"
Apr 16 15:16:10.845272 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.845208 2573 scope.go:117] "RemoveContainer" containerID="6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8"
Apr 16 15:16:10.845433 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.845417 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8"} err="failed to get container status \"6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8\": rpc error: code = NotFound desc = could not find container \"6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8\": container with ID starting with 6f5e09ab4d8a709c1694efdb5d9bf5295b3109f0a4c351b0a5a25cfe117a69c8 not found: ID does not exist"
Apr 16 15:16:10.845481 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.845435 2573 scope.go:117] "RemoveContainer" containerID="609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e"
Apr 16 15:16:10.845672 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.845633 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e"} err="failed to get container status \"609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e\": rpc error: code = NotFound desc = could not find container \"609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e\": container with ID starting with 609ddcfa4456edf45a4452a1da2e173bdef63b8316cf7b013208f72095e7e62e not found: ID does not exist"
Apr 16 15:16:10.845725 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.845672 2573 scope.go:117] "RemoveContainer" containerID="6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835"
Apr 16 15:16:10.845865 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.845848 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835"} err="failed to get container status \"6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835\": rpc error: code = NotFound desc = could not find container \"6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835\": container with ID starting with 6c7aba47f64d3086eda855a1463791f7668ce9fa575f56d8948aeb17970bc835 not found: ID does not exist"
Apr 16 15:16:10.845905 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.845867 2573 scope.go:117] "RemoveContainer" containerID="09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56"
Apr 16 15:16:10.846051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.846034 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56"} err="failed to get container status \"09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56\": rpc error: code = NotFound desc = could not find container \"09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56\": container with ID starting with 09e95a46b96c27311c034564825cf4f3d404a234bed43ea79c4accd88ab3ac56 not found: ID does not exist"
Apr 16 15:16:10.846115 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.846052 2573 scope.go:117] "RemoveContainer" containerID="d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4"
Apr 16 15:16:10.846291 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.846276 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4"} err="failed to get container status \"d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4\": rpc error: code = NotFound desc = could not find container \"d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4\": container with ID starting with d5bccae011c3857c722cd559a0c60f16e515744885dfd516213f8e29082809d4 not found: ID does not exist"
Apr 16 15:16:10.846340 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.846292 2573 scope.go:117] "RemoveContainer" containerID="cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498"
Apr 16 15:16:10.846449 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.846434 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498"} err="failed to get container status \"cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498\": rpc error: code = NotFound desc = could not find container \"cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498\": container with ID starting with cba9a84aec1dae6e991a70729d1430da6e4d17037419e4873d4411c00f0d2498 not found: ID does not exist"
Apr 16 15:16:10.873746 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.873719 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 15:16:10.874014 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874003 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy-web"
Apr 16 15:16:10.874051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874027 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy-web"
Apr 16 15:16:10.874051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874038 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="alertmanager"
Apr 16 15:16:10.874051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874044 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="alertmanager"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874054 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="config-reloader"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874061 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="config-reloader"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874067 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="prom-label-proxy"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874073 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="prom-label-proxy"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874080 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="init-config-reloader"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874086 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="init-config-reloader"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874092 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874096 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874103 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ae9dab3-c42d-47b1-95c7-32208caa61f6" containerName="console"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874108 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae9dab3-c42d-47b1-95c7-32208caa61f6" containerName="console"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874115 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy-metric"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874120 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy-metric"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874170 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy-web"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874177 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy-metric"
Apr 16 15:16:10.874178 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874184 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="kube-rbac-proxy"
Apr 16 15:16:10.874566 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874192 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="prom-label-proxy"
Apr 16 15:16:10.874566 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874198 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="config-reloader"
Apr 16 15:16:10.874566 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874204 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" containerName="alertmanager"
Apr 16 15:16:10.874566 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.874210 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ae9dab3-c42d-47b1-95c7-32208caa61f6" containerName="console"
Apr 16 15:16:10.878753 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.878735 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.881909 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.881888 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-kgqmd\""
Apr 16 15:16:10.882130 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.882108 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 15:16:10.882239 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.882155 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 15:16:10.882239 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.882117 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 15:16:10.882239 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.882193 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 15:16:10.882408 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.882329 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 15:16:10.882481 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.882467 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 15:16:10.883473 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.883441 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 15:16:10.883575 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.883506 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 15:16:10.887919 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.887897 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 15:16:10.896000 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.895979 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 15:16:10.935955 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.935923 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936095 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.935959 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936095 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.935979 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85937bc3-314c-43ed-a893-fa81d55b0bc5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936095 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.936004 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936095 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.936019 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85937bc3-314c-43ed-a893-fa81d55b0bc5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936095 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.936073 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936095 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.936088 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85937bc3-314c-43ed-a893-fa81d55b0bc5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936366 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.936142 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85937bc3-314c-43ed-a893-fa81d55b0bc5-config-out\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936366 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.936180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-config-volume\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936366 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.936210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/85937bc3-314c-43ed-a893-fa81d55b0bc5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936366 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.936255 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpmq9\" (UniqueName: \"kubernetes.io/projected/85937bc3-314c-43ed-a893-fa81d55b0bc5-kube-api-access-xpmq9\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936366 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.936292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-web-config\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:10.936366 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:10.936311 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.037426 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpmq9\" (UniqueName: \"kubernetes.io/projected/85937bc3-314c-43ed-a893-fa81d55b0bc5-kube-api-access-xpmq9\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.037426 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-web-config\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.037735 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.037735 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.037735 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.037735 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037523 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85937bc3-314c-43ed-a893-fa81d55b0bc5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.037735 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.037735 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85937bc3-314c-43ed-a893-fa81d55b0bc5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.038046 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.038046 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037796 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85937bc3-314c-43ed-a893-fa81d55b0bc5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.038046 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85937bc3-314c-43ed-a893-fa81d55b0bc5-config-out\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.038046 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037861 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-config-volume\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.038046 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.037889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/85937bc3-314c-43ed-a893-fa81d55b0bc5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.038441 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.038307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/85937bc3-314c-43ed-a893-fa81d55b0bc5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.039531 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.039498 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85937bc3-314c-43ed-a893-fa81d55b0bc5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.040823 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.040799 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.040906 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.040825 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85937bc3-314c-43ed-a893-fa81d55b0bc5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.040970 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.040950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-web-config\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.041046 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.041024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.041165 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.041143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.041353 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.041333 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.041403 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.041389 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.041801 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.041748 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85937bc3-314c-43ed-a893-fa81d55b0bc5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.041801 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.041765 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85937bc3-314c-43ed-a893-fa81d55b0bc5-config-out\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.042938 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.042917 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/85937bc3-314c-43ed-a893-fa81d55b0bc5-config-volume\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.046426 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.046407 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpmq9\" (UniqueName: \"kubernetes.io/projected/85937bc3-314c-43ed-a893-fa81d55b0bc5-kube-api-access-xpmq9\") pod \"alertmanager-main-0\" (UID: \"85937bc3-314c-43ed-a893-fa81d55b0bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.188266 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.188239 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 15:16:11.320570 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.320486 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 15:16:11.323353 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:16:11.323328 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85937bc3_314c_43ed_a893_fa81d55b0bc5.slice/crio-f5fec0ece0f185c0c174ab381e6181b539c15a242a382bab6968dc61ed3c5803 WatchSource:0}: Error finding container f5fec0ece0f185c0c174ab381e6181b539c15a242a382bab6968dc61ed3c5803: Status 404 returned error can't find the container with id f5fec0ece0f185c0c174ab381e6181b539c15a242a382bab6968dc61ed3c5803
Apr 16 15:16:11.796373 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.796337 2573 generic.go:358] "Generic (PLEG): container finished" podID="85937bc3-314c-43ed-a893-fa81d55b0bc5" containerID="08b04752cfa7133d1dbbc0083be869c46c811593c564067c8f8b2409fb7a0354" exitCode=0
Apr 16 15:16:11.796840 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.796373 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"85937bc3-314c-43ed-a893-fa81d55b0bc5","Type":"ContainerDied","Data":"08b04752cfa7133d1dbbc0083be869c46c811593c564067c8f8b2409fb7a0354"}
Apr 16 15:16:11.796840 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:11.796411 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"85937bc3-314c-43ed-a893-fa81d55b0bc5","Type":"ContainerStarted","Data":"f5fec0ece0f185c0c174ab381e6181b539c15a242a382bab6968dc61ed3c5803"}
Apr 16 15:16:12.017211 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:12.017184 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3411df5-97c2-4726-80d7-73987b6690e9" path="/var/lib/kubelet/pods/c3411df5-97c2-4726-80d7-73987b6690e9/volumes"
Apr 16 15:16:12.803096 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:12.803063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"85937bc3-314c-43ed-a893-fa81d55b0bc5","Type":"ContainerStarted","Data":"41048f7ef3258a9ec1477b552b20b2b0e433207f68ce758fe625abebd6869796"}
Apr 16 15:16:12.803096 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:12.803096 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"85937bc3-314c-43ed-a893-fa81d55b0bc5","Type":"ContainerStarted","Data":"1763c88be48ac36cdf54845b4a56b89fe2b6a74dcc18b1c55cdd51bf110f7289"}
Apr 16 15:16:12.803626 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:12.803107 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"85937bc3-314c-43ed-a893-fa81d55b0bc5","Type":"ContainerStarted","Data":"b97471d41e47c04f4b877cb72d373d93a26d5a34c9bf8ec3416978e551ea9d7c"}
Apr 16 15:16:12.803626 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:12.803115 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"85937bc3-314c-43ed-a893-fa81d55b0bc5","Type":"ContainerStarted","Data":"33a415d7f50ef16134260c69bcb2e7ef2007d36015969a0ea799b59113e963c8"}
Apr 16 15:16:12.803626 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:12.803123 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"85937bc3-314c-43ed-a893-fa81d55b0bc5","Type":"ContainerStarted","Data":"6da724023530185e236556d53cb43d7b729ca42545fcbd672e8e88073be0b484"}
Apr 16 15:16:12.803626 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:12.803131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0"
event={"ID":"85937bc3-314c-43ed-a893-fa81d55b0bc5","Type":"ContainerStarted","Data":"14d76a0f5da09b337e4ddad61d1a44aa541367e775722b9563c53c66a929b455"} Apr 16 15:16:12.833820 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:12.833774 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.833760477 podStartE2EDuration="2.833760477s" podCreationTimestamp="2026-04-16 15:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:16:12.833584563 +0000 UTC m=+251.419788372" watchObservedRunningTime="2026-04-16 15:16:12.833760477 +0000 UTC m=+251.419964286" Apr 16 15:16:13.529823 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.529791 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4"] Apr 16 15:16:13.533332 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.533312 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.536278 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.536241 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 15:16:13.536493 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.536461 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-62nd2\"" Apr 16 15:16:13.536556 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.536526 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 15:16:13.536556 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.536544 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 15:16:13.536660 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.536562 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 15:16:13.536947 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.536929 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 15:16:13.543182 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.543157 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 15:16:13.551578 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.551558 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4"] Apr 16 15:16:13.566799 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.566776 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-federate-client-tls\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.566898 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.566831 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-secret-telemeter-client\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.566898 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.566872 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be4ec10-6366-4b37-824d-713912b61673-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.566898 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.566892 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be4ec10-6366-4b37-824d-713912b61673-serving-certs-ca-bundle\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.566994 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.566906 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-telemeter-client-tls\") 
pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.566994 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.566939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlvjl\" (UniqueName: \"kubernetes.io/projected/3be4ec10-6366-4b37-824d-713912b61673-kube-api-access-zlvjl\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.566994 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.566982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3be4ec10-6366-4b37-824d-713912b61673-metrics-client-ca\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.567082 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.567012 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.667561 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.667526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be4ec10-6366-4b37-824d-713912b61673-serving-certs-ca-bundle\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " 
pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.667561 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.667563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-telemeter-client-tls\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.667821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.667593 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlvjl\" (UniqueName: \"kubernetes.io/projected/3be4ec10-6366-4b37-824d-713912b61673-kube-api-access-zlvjl\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.667821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.667618 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3be4ec10-6366-4b37-824d-713912b61673-metrics-client-ca\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.667821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.667666 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.667821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.667715 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-federate-client-tls\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.667821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.667787 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-secret-telemeter-client\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.667821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.667814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be4ec10-6366-4b37-824d-713912b61673-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.668430 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.668400 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3be4ec10-6366-4b37-824d-713912b61673-metrics-client-ca\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.668855 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.668833 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be4ec10-6366-4b37-824d-713912b61673-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: 
\"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.668948 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.668855 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be4ec10-6366-4b37-824d-713912b61673-serving-certs-ca-bundle\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.670496 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.670476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-federate-client-tls\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.670921 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.670890 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-telemeter-client-tls\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.670976 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.670927 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-secret-telemeter-client\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.671052 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.671032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3be4ec10-6366-4b37-824d-713912b61673-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.677604 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.677582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlvjl\" (UniqueName: \"kubernetes.io/projected/3be4ec10-6366-4b37-824d-713912b61673-kube-api-access-zlvjl\") pod \"telemeter-client-6db7cb95c8-rz2z4\" (UID: \"3be4ec10-6366-4b37-824d-713912b61673\") " pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.843820 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.843721 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" Apr 16 15:16:13.870529 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.870215 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:16:13.872736 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.872708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414282ac-76dc-4992-819a-bdf05043c09d-metrics-certs\") pod \"network-metrics-daemon-h9c44\" (UID: \"414282ac-76dc-4992-819a-bdf05043c09d\") " pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:16:13.917577 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.917548 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5mf5n\"" Apr 16 15:16:13.925115 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.925089 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h9c44" Apr 16 15:16:13.987406 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:13.987000 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4"] Apr 16 15:16:14.061980 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:14.061948 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h9c44"] Apr 16 15:16:14.065003 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:16:14.064974 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod414282ac_76dc_4992_819a_bdf05043c09d.slice/crio-cf53ba2a7839d4157099e8336352e9c762da7c84e2da32f6d1616119f8992d0a WatchSource:0}: Error finding container cf53ba2a7839d4157099e8336352e9c762da7c84e2da32f6d1616119f8992d0a: Status 404 returned error can't find the container with id cf53ba2a7839d4157099e8336352e9c762da7c84e2da32f6d1616119f8992d0a Apr 16 15:16:14.812165 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:14.812125 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h9c44" event={"ID":"414282ac-76dc-4992-819a-bdf05043c09d","Type":"ContainerStarted","Data":"cf53ba2a7839d4157099e8336352e9c762da7c84e2da32f6d1616119f8992d0a"} Apr 16 15:16:14.813220 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:14.813192 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" event={"ID":"3be4ec10-6366-4b37-824d-713912b61673","Type":"ContainerStarted","Data":"36db61476584af1cbd4e38b035c11367685f5dde360551b4c8ee0431f4acdc2f"} Apr 16 15:16:15.819411 ip-10-0-136-151 kubenswrapper[2573]: I0416 
15:16:15.819318 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h9c44" event={"ID":"414282ac-76dc-4992-819a-bdf05043c09d","Type":"ContainerStarted","Data":"8835b82e8415e220a6babbba5d795bbbcead6efcd8bdf63e3488d9ddcfe65066"} Apr 16 15:16:15.819411 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:15.819365 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h9c44" event={"ID":"414282ac-76dc-4992-819a-bdf05043c09d","Type":"ContainerStarted","Data":"a4a79dbb30722a89ab1e1fa16db2d275e1f758a67ac7761073a5c801e80d2bd8"} Apr 16 15:16:16.824150 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:16.824114 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" event={"ID":"3be4ec10-6366-4b37-824d-713912b61673","Type":"ContainerStarted","Data":"8fab29e143ec7624f1bc349e36c23afb258ddf19be441f75e97f0f9938aed646"} Apr 16 15:16:16.824150 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:16.824156 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" event={"ID":"3be4ec10-6366-4b37-824d-713912b61673","Type":"ContainerStarted","Data":"67e090fcbb388d63a140787af287bb40b71b1f239a54b87717ca2996b20b653a"} Apr 16 15:16:16.824721 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:16.824170 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" event={"ID":"3be4ec10-6366-4b37-824d-713912b61673","Type":"ContainerStarted","Data":"e2fd379b89d5ff4906c68ddae2b079d1de8615252a2ba4e3256252592d1c8e88"} Apr 16 15:16:16.848291 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:16.848246 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6db7cb95c8-rz2z4" podStartSLOduration=1.914006166 podStartE2EDuration="3.84822873s" podCreationTimestamp="2026-04-16 15:16:13 +0000 
UTC" firstStartedPulling="2026-04-16 15:16:13.995186109 +0000 UTC m=+252.581389899" lastFinishedPulling="2026-04-16 15:16:15.929408665 +0000 UTC m=+254.515612463" observedRunningTime="2026-04-16 15:16:16.846792635 +0000 UTC m=+255.432996444" watchObservedRunningTime="2026-04-16 15:16:16.84822873 +0000 UTC m=+255.434432542" Apr 16 15:16:16.848578 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:16.848369 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h9c44" podStartSLOduration=252.769908754 podStartE2EDuration="4m13.848364003s" podCreationTimestamp="2026-04-16 15:12:03 +0000 UTC" firstStartedPulling="2026-04-16 15:16:14.066911148 +0000 UTC m=+252.653114937" lastFinishedPulling="2026-04-16 15:16:15.145366399 +0000 UTC m=+253.731570186" observedRunningTime="2026-04-16 15:16:15.845124253 +0000 UTC m=+254.431328063" watchObservedRunningTime="2026-04-16 15:16:16.848364003 +0000 UTC m=+255.434567819" Apr 16 15:16:17.656061 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.656028 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9cc94ddb-lr2gk"] Apr 16 15:16:17.660200 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.660173 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9cc94ddb-lr2gk" Apr 16 15:16:17.675427 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.675403 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9cc94ddb-lr2gk"] Apr 16 15:16:17.702980 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.702953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-service-ca\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk" Apr 16 15:16:17.703156 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.702991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-oauth-serving-cert\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk" Apr 16 15:16:17.703156 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.703011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtxzs\" (UniqueName: \"kubernetes.io/projected/080a249d-4649-460c-9f75-f93f01388488-kube-api-access-mtxzs\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk" Apr 16 15:16:17.703156 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.703072 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-oauth-config\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk" Apr 16 15:16:17.703156 
ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.703119 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-trusted-ca-bundle\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk" Apr 16 15:16:17.703156 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.703152 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-console-config\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk" Apr 16 15:16:17.703406 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.703218 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-serving-cert\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk" Apr 16 15:16:17.804368 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.804332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-service-ca\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk" Apr 16 15:16:17.804669 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.804628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-oauth-serving-cert\") pod \"console-9cc94ddb-lr2gk\" (UID: 
\"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.804872 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.804821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtxzs\" (UniqueName: \"kubernetes.io/projected/080a249d-4649-460c-9f75-f93f01388488-kube-api-access-mtxzs\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.805019 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.805005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-oauth-config\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.805147 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.805134 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-trusted-ca-bundle\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.806103 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.805335 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-oauth-serving-cert\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.806103 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.805769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-service-ca\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.806103 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.805946 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-trusted-ca-bundle\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.806666 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.806617 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-console-config\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.806770 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.806712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-serving-cert\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.806886 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.806865 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-console-config\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.807937 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.807916 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-oauth-config\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.809047 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.809029 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-serving-cert\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.816683 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.816635 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtxzs\" (UniqueName: \"kubernetes.io/projected/080a249d-4649-460c-9f75-f93f01388488-kube-api-access-mtxzs\") pod \"console-9cc94ddb-lr2gk\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:17.972343 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:17.972242 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:18.118036 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:18.117995 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9cc94ddb-lr2gk"]
Apr 16 15:16:18.121360 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:16:18.121331 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod080a249d_4649_460c_9f75_f93f01388488.slice/crio-505182d1622abde6e502d3a916a98b178b115a54801f8fe253f52c44be1b4951 WatchSource:0}: Error finding container 505182d1622abde6e502d3a916a98b178b115a54801f8fe253f52c44be1b4951: Status 404 returned error can't find the container with id 505182d1622abde6e502d3a916a98b178b115a54801f8fe253f52c44be1b4951
Apr 16 15:16:18.831575 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:18.831534 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9cc94ddb-lr2gk" event={"ID":"080a249d-4649-460c-9f75-f93f01388488","Type":"ContainerStarted","Data":"d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f"}
Apr 16 15:16:18.831727 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:18.831582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9cc94ddb-lr2gk" event={"ID":"080a249d-4649-460c-9f75-f93f01388488","Type":"ContainerStarted","Data":"505182d1622abde6e502d3a916a98b178b115a54801f8fe253f52c44be1b4951"}
Apr 16 15:16:18.851830 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:18.851784 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9cc94ddb-lr2gk" podStartSLOduration=1.851770112 podStartE2EDuration="1.851770112s" podCreationTimestamp="2026-04-16 15:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:16:18.850235466 +0000 UTC m=+257.436439312" watchObservedRunningTime="2026-04-16 15:16:18.851770112 +0000 UTC m=+257.437973923"
Apr 16 15:16:27.972485 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:27.972435 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:27.972485 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:27.972494 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:27.977291 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:27.977269 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:28.867171 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:28.867141 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:16:28.912950 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:28.912921 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dd64565db-45dmz"]
Apr 16 15:16:53.936133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:53.936063 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-dd64565db-45dmz" podUID="4a66b64f-9125-4d76-8a81-4a110f1909d6" containerName="console" containerID="cri-o://ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1" gracePeriod=15
Apr 16 15:16:54.181754 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.181731 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dd64565db-45dmz_4a66b64f-9125-4d76-8a81-4a110f1909d6/console/0.log"
Apr 16 15:16:54.181887 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.181798 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:16:54.211642 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.211574 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-serving-cert\") pod \"4a66b64f-9125-4d76-8a81-4a110f1909d6\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") "
Apr 16 15:16:54.211769 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.211684 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-trusted-ca-bundle\") pod \"4a66b64f-9125-4d76-8a81-4a110f1909d6\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") "
Apr 16 15:16:54.211769 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.211738 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-service-ca\") pod \"4a66b64f-9125-4d76-8a81-4a110f1909d6\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") "
Apr 16 15:16:54.211875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.211798 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-config\") pod \"4a66b64f-9125-4d76-8a81-4a110f1909d6\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") "
Apr 16 15:16:54.211875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.211831 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-oauth-serving-cert\") pod \"4a66b64f-9125-4d76-8a81-4a110f1909d6\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") "
Apr 16 15:16:54.211875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.211863 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg2cr\" (UniqueName: \"kubernetes.io/projected/4a66b64f-9125-4d76-8a81-4a110f1909d6-kube-api-access-vg2cr\") pod \"4a66b64f-9125-4d76-8a81-4a110f1909d6\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") "
Apr 16 15:16:54.212025 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.211887 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-oauth-config\") pod \"4a66b64f-9125-4d76-8a81-4a110f1909d6\" (UID: \"4a66b64f-9125-4d76-8a81-4a110f1909d6\") "
Apr 16 15:16:54.212181 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.212094 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4a66b64f-9125-4d76-8a81-4a110f1909d6" (UID: "4a66b64f-9125-4d76-8a81-4a110f1909d6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:16:54.212181 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.212137 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-service-ca" (OuterVolumeSpecName: "service-ca") pod "4a66b64f-9125-4d76-8a81-4a110f1909d6" (UID: "4a66b64f-9125-4d76-8a81-4a110f1909d6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:16:54.212340 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.212208 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4a66b64f-9125-4d76-8a81-4a110f1909d6" (UID: "4a66b64f-9125-4d76-8a81-4a110f1909d6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:16:54.212340 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.212262 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-trusted-ca-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:16:54.212340 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.212285 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-service-ca\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:16:54.212872 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.212829 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-config" (OuterVolumeSpecName: "console-config") pod "4a66b64f-9125-4d76-8a81-4a110f1909d6" (UID: "4a66b64f-9125-4d76-8a81-4a110f1909d6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:16:54.214001 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.213980 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4a66b64f-9125-4d76-8a81-4a110f1909d6" (UID: "4a66b64f-9125-4d76-8a81-4a110f1909d6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:16:54.214290 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.214270 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a66b64f-9125-4d76-8a81-4a110f1909d6-kube-api-access-vg2cr" (OuterVolumeSpecName: "kube-api-access-vg2cr") pod "4a66b64f-9125-4d76-8a81-4a110f1909d6" (UID: "4a66b64f-9125-4d76-8a81-4a110f1909d6"). InnerVolumeSpecName "kube-api-access-vg2cr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:16:54.214364 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.214324 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4a66b64f-9125-4d76-8a81-4a110f1909d6" (UID: "4a66b64f-9125-4d76-8a81-4a110f1909d6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:16:54.313115 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.313079 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-config\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:16:54.313115 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.313108 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a66b64f-9125-4d76-8a81-4a110f1909d6-oauth-serving-cert\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:16:54.313115 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.313117 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vg2cr\" (UniqueName: \"kubernetes.io/projected/4a66b64f-9125-4d76-8a81-4a110f1909d6-kube-api-access-vg2cr\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:16:54.313337 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.313128 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-oauth-config\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:16:54.313337 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.313143 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a66b64f-9125-4d76-8a81-4a110f1909d6-console-serving-cert\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:16:54.944483 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.944457 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dd64565db-45dmz_4a66b64f-9125-4d76-8a81-4a110f1909d6/console/0.log"
Apr 16 15:16:54.944868 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.944499 2573 generic.go:358] "Generic (PLEG): container finished" podID="4a66b64f-9125-4d76-8a81-4a110f1909d6" containerID="ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1" exitCode=2
Apr 16 15:16:54.944868 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.944531 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd64565db-45dmz" event={"ID":"4a66b64f-9125-4d76-8a81-4a110f1909d6","Type":"ContainerDied","Data":"ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1"}
Apr 16 15:16:54.944868 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.944553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd64565db-45dmz" event={"ID":"4a66b64f-9125-4d76-8a81-4a110f1909d6","Type":"ContainerDied","Data":"79760de1a4e137ccb5b921102e1c509a88e0cf033abae999010dbeafe30dd615"}
Apr 16 15:16:54.944868 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.944563 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd64565db-45dmz"
Apr 16 15:16:54.944868 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.944567 2573 scope.go:117] "RemoveContainer" containerID="ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1"
Apr 16 15:16:54.953203 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.953186 2573 scope.go:117] "RemoveContainer" containerID="ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1"
Apr 16 15:16:54.953437 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:16:54.953415 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1\": container with ID starting with ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1 not found: ID does not exist" containerID="ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1"
Apr 16 15:16:54.953505 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.953448 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1"} err="failed to get container status \"ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1\": rpc error: code = NotFound desc = could not find container \"ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1\": container with ID starting with ca490ed26df6e2684c0f0711eb5f6ec91dc693e7d105e41907695a23c94e34f1 not found: ID does not exist"
Apr 16 15:16:54.968844 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.968821 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dd64565db-45dmz"]
Apr 16 15:16:54.974830 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:54.974808 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-dd64565db-45dmz"]
Apr 16 15:16:56.018277 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:16:56.018245 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a66b64f-9125-4d76-8a81-4a110f1909d6" path="/var/lib/kubelet/pods/4a66b64f-9125-4d76-8a81-4a110f1909d6/volumes"
Apr 16 15:17:01.890460 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:01.890427 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log"
Apr 16 15:17:01.893834 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:01.893806 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log"
Apr 16 15:17:01.895863 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:01.895841 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log"
Apr 16 15:17:01.899925 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:01.899904 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log"
Apr 16 15:17:01.901603 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:01.901587 2573 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 15:17:29.838312 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:29.838279 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bbf4d9849-clvm8"]
Apr 16 15:17:29.840312 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:29.838607 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a66b64f-9125-4d76-8a81-4a110f1909d6" containerName="console"
Apr 16 15:17:29.840312 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:29.838619 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a66b64f-9125-4d76-8a81-4a110f1909d6" containerName="console"
Apr 16 15:17:29.840312 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:29.838698 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a66b64f-9125-4d76-8a81-4a110f1909d6" containerName="console"
Apr 16 15:17:29.841398 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:29.841379 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:29.854200 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:29.854179 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bbf4d9849-clvm8"]
Apr 16 15:17:30.021772 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.021745 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-serving-cert\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.021927 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.021777 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vvhf\" (UniqueName: \"kubernetes.io/projected/a7635360-cbb1-4161-9e1f-d65f7cef6c61-kube-api-access-9vvhf\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.021927 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.021799 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-service-ca\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.021927 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.021825 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-oauth-config\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.021927 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.021879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-trusted-ca-bundle\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.021927 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.021924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-config\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.022131 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.021946 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-oauth-serving-cert\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.122996 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.122911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-config\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.122996 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.122948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-oauth-serving-cert\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.122996 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.122976 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-serving-cert\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.122996 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.122996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vvhf\" (UniqueName: \"kubernetes.io/projected/a7635360-cbb1-4161-9e1f-d65f7cef6c61-kube-api-access-9vvhf\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.123313 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.123019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-service-ca\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.123313 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.123057 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-oauth-config\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.123313 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.123118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-trusted-ca-bundle\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.123733 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.123707 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-oauth-serving-cert\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.123868 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.123796 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-config\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.123868 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.123819 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-service-ca\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.124093 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.124077 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-trusted-ca-bundle\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.126226 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.126204 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-serving-cert\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.126353 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.126331 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-oauth-config\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.132027 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.132008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vvhf\" (UniqueName: \"kubernetes.io/projected/a7635360-cbb1-4161-9e1f-d65f7cef6c61-kube-api-access-9vvhf\") pod \"console-6bbf4d9849-clvm8\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.150934 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.150912 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:30.277416 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.277390 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bbf4d9849-clvm8"]
Apr 16 15:17:30.279585 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:17:30.279552 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7635360_cbb1_4161_9e1f_d65f7cef6c61.slice/crio-e3d6d87c52e3f6d35183f06528dd0ea1a6d10625e6cdd20da9e7733037d2d28f WatchSource:0}: Error finding container e3d6d87c52e3f6d35183f06528dd0ea1a6d10625e6cdd20da9e7733037d2d28f: Status 404 returned error can't find the container with id e3d6d87c52e3f6d35183f06528dd0ea1a6d10625e6cdd20da9e7733037d2d28f
Apr 16 15:17:30.281448 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:30.281432 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:17:31.051502 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:31.051469 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbf4d9849-clvm8" event={"ID":"a7635360-cbb1-4161-9e1f-d65f7cef6c61","Type":"ContainerStarted","Data":"5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e"}
Apr 16 15:17:31.051502 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:31.051503 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbf4d9849-clvm8" event={"ID":"a7635360-cbb1-4161-9e1f-d65f7cef6c61","Type":"ContainerStarted","Data":"e3d6d87c52e3f6d35183f06528dd0ea1a6d10625e6cdd20da9e7733037d2d28f"}
Apr 16 15:17:31.070023 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:31.069971 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bbf4d9849-clvm8" podStartSLOduration=2.069953613 podStartE2EDuration="2.069953613s" podCreationTimestamp="2026-04-16 15:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:17:31.069047535 +0000 UTC m=+329.655251357" watchObservedRunningTime="2026-04-16 15:17:31.069953613 +0000 UTC m=+329.656157422"
Apr 16 15:17:40.151875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:40.151838 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:40.151875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:40.151884 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:40.156516 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:40.156490 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:41.083541 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:41.083511 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bbf4d9849-clvm8"
Apr 16 15:17:41.132157 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:17:41.132126 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9cc94ddb-lr2gk"]
Apr 16 15:18:06.153072 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.153009 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9cc94ddb-lr2gk" podUID="080a249d-4649-460c-9f75-f93f01388488" containerName="console" containerID="cri-o://d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f" gracePeriod=15
Apr 16 15:18:06.390968 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.390947 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9cc94ddb-lr2gk_080a249d-4649-460c-9f75-f93f01388488/console/0.log"
Apr 16 15:18:06.391073 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.391012 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9cc94ddb-lr2gk"
Apr 16 15:18:06.523747 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.523671 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-trusted-ca-bundle\") pod \"080a249d-4649-460c-9f75-f93f01388488\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") "
Apr 16 15:18:06.523747 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.523715 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-oauth-serving-cert\") pod \"080a249d-4649-460c-9f75-f93f01388488\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") "
Apr 16 15:18:06.523952 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.523756 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-serving-cert\") pod \"080a249d-4649-460c-9f75-f93f01388488\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") "
Apr 16 15:18:06.523952 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.523779 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-service-ca\") pod \"080a249d-4649-460c-9f75-f93f01388488\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") "
Apr 16 15:18:06.523952 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.523797 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-oauth-config\") pod \"080a249d-4649-460c-9f75-f93f01388488\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\")
" Apr 16 15:18:06.523952 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.523849 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtxzs\" (UniqueName: \"kubernetes.io/projected/080a249d-4649-460c-9f75-f93f01388488-kube-api-access-mtxzs\") pod \"080a249d-4649-460c-9f75-f93f01388488\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " Apr 16 15:18:06.523952 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.523903 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-console-config\") pod \"080a249d-4649-460c-9f75-f93f01388488\" (UID: \"080a249d-4649-460c-9f75-f93f01388488\") " Apr 16 15:18:06.524183 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.524154 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "080a249d-4649-460c-9f75-f93f01388488" (UID: "080a249d-4649-460c-9f75-f93f01388488"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:18:06.524238 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.524186 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "080a249d-4649-460c-9f75-f93f01388488" (UID: "080a249d-4649-460c-9f75-f93f01388488"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:18:06.524320 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.524295 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-service-ca" (OuterVolumeSpecName: "service-ca") pod "080a249d-4649-460c-9f75-f93f01388488" (UID: "080a249d-4649-460c-9f75-f93f01388488"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:18:06.524419 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.524391 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-console-config" (OuterVolumeSpecName: "console-config") pod "080a249d-4649-460c-9f75-f93f01388488" (UID: "080a249d-4649-460c-9f75-f93f01388488"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:18:06.526023 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.525993 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "080a249d-4649-460c-9f75-f93f01388488" (UID: "080a249d-4649-460c-9f75-f93f01388488"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:18:06.526129 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.526107 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "080a249d-4649-460c-9f75-f93f01388488" (UID: "080a249d-4649-460c-9f75-f93f01388488"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:18:06.526192 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.526165 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/080a249d-4649-460c-9f75-f93f01388488-kube-api-access-mtxzs" (OuterVolumeSpecName: "kube-api-access-mtxzs") pod "080a249d-4649-460c-9f75-f93f01388488" (UID: "080a249d-4649-460c-9f75-f93f01388488"). InnerVolumeSpecName "kube-api-access-mtxzs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:18:06.625288 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.625264 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-serving-cert\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:18:06.625288 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.625288 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-service-ca\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:18:06.625482 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.625298 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/080a249d-4649-460c-9f75-f93f01388488-console-oauth-config\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:18:06.625482 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.625308 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mtxzs\" (UniqueName: \"kubernetes.io/projected/080a249d-4649-460c-9f75-f93f01388488-kube-api-access-mtxzs\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:18:06.625482 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.625317 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-console-config\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:18:06.625482 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.625326 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-trusted-ca-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:18:06.625482 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:06.625334 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/080a249d-4649-460c-9f75-f93f01388488-oauth-serving-cert\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:18:07.153442 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:07.153414 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9cc94ddb-lr2gk_080a249d-4649-460c-9f75-f93f01388488/console/0.log" Apr 16 15:18:07.153866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:07.153453 2573 generic.go:358] "Generic (PLEG): container finished" podID="080a249d-4649-460c-9f75-f93f01388488" containerID="d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f" exitCode=2 Apr 16 15:18:07.153866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:07.153504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9cc94ddb-lr2gk" event={"ID":"080a249d-4649-460c-9f75-f93f01388488","Type":"ContainerDied","Data":"d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f"} Apr 16 15:18:07.153866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:07.153514 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9cc94ddb-lr2gk" Apr 16 15:18:07.153866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:07.153525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9cc94ddb-lr2gk" event={"ID":"080a249d-4649-460c-9f75-f93f01388488","Type":"ContainerDied","Data":"505182d1622abde6e502d3a916a98b178b115a54801f8fe253f52c44be1b4951"} Apr 16 15:18:07.153866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:07.153539 2573 scope.go:117] "RemoveContainer" containerID="d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f" Apr 16 15:18:07.162279 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:07.162250 2573 scope.go:117] "RemoveContainer" containerID="d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f" Apr 16 15:18:07.162501 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:18:07.162486 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f\": container with ID starting with d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f not found: ID does not exist" containerID="d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f" Apr 16 15:18:07.162554 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:07.162508 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f"} err="failed to get container status \"d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f\": rpc error: code = NotFound desc = could not find container \"d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f\": container with ID starting with d3ca95ab8714a221f027faf70677ea4f372f8ad7757721da512324e4e09e836f not found: ID does not exist" Apr 16 15:18:07.177113 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:07.177088 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9cc94ddb-lr2gk"] Apr 16 15:18:07.182003 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:07.181983 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9cc94ddb-lr2gk"] Apr 16 15:18:08.018723 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:08.018691 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="080a249d-4649-460c-9f75-f93f01388488" path="/var/lib/kubelet/pods/080a249d-4649-460c-9f75-f93f01388488/volumes" Apr 16 15:18:24.410818 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.410784 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mspjk"] Apr 16 15:18:24.411392 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.411239 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="080a249d-4649-460c-9f75-f93f01388488" containerName="console" Apr 16 15:18:24.411392 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.411258 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="080a249d-4649-460c-9f75-f93f01388488" containerName="console" Apr 16 15:18:24.411392 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.411358 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="080a249d-4649-460c-9f75-f93f01388488" containerName="console" Apr 16 15:18:24.413293 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.413273 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mspjk" Apr 16 15:18:24.415882 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.415865 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 15:18:24.420982 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.420959 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mspjk"] Apr 16 15:18:24.586026 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.585988 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d685ae4d-d7ff-407e-bf27-ec6402f786a0-original-pull-secret\") pod \"global-pull-secret-syncer-mspjk\" (UID: \"d685ae4d-d7ff-407e-bf27-ec6402f786a0\") " pod="kube-system/global-pull-secret-syncer-mspjk" Apr 16 15:18:24.586199 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.586046 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d685ae4d-d7ff-407e-bf27-ec6402f786a0-dbus\") pod \"global-pull-secret-syncer-mspjk\" (UID: \"d685ae4d-d7ff-407e-bf27-ec6402f786a0\") " pod="kube-system/global-pull-secret-syncer-mspjk" Apr 16 15:18:24.586199 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.586162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d685ae4d-d7ff-407e-bf27-ec6402f786a0-kubelet-config\") pod \"global-pull-secret-syncer-mspjk\" (UID: \"d685ae4d-d7ff-407e-bf27-ec6402f786a0\") " pod="kube-system/global-pull-secret-syncer-mspjk" Apr 16 15:18:24.686823 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.686732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/d685ae4d-d7ff-407e-bf27-ec6402f786a0-dbus\") pod \"global-pull-secret-syncer-mspjk\" (UID: \"d685ae4d-d7ff-407e-bf27-ec6402f786a0\") " pod="kube-system/global-pull-secret-syncer-mspjk" Apr 16 15:18:24.686823 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.686806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d685ae4d-d7ff-407e-bf27-ec6402f786a0-kubelet-config\") pod \"global-pull-secret-syncer-mspjk\" (UID: \"d685ae4d-d7ff-407e-bf27-ec6402f786a0\") " pod="kube-system/global-pull-secret-syncer-mspjk" Apr 16 15:18:24.687006 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.686838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d685ae4d-d7ff-407e-bf27-ec6402f786a0-original-pull-secret\") pod \"global-pull-secret-syncer-mspjk\" (UID: \"d685ae4d-d7ff-407e-bf27-ec6402f786a0\") " pod="kube-system/global-pull-secret-syncer-mspjk" Apr 16 15:18:24.687006 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.686918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d685ae4d-d7ff-407e-bf27-ec6402f786a0-kubelet-config\") pod \"global-pull-secret-syncer-mspjk\" (UID: \"d685ae4d-d7ff-407e-bf27-ec6402f786a0\") " pod="kube-system/global-pull-secret-syncer-mspjk" Apr 16 15:18:24.687006 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.686919 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d685ae4d-d7ff-407e-bf27-ec6402f786a0-dbus\") pod \"global-pull-secret-syncer-mspjk\" (UID: \"d685ae4d-d7ff-407e-bf27-ec6402f786a0\") " pod="kube-system/global-pull-secret-syncer-mspjk" Apr 16 15:18:24.689302 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.689280 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d685ae4d-d7ff-407e-bf27-ec6402f786a0-original-pull-secret\") pod \"global-pull-secret-syncer-mspjk\" (UID: \"d685ae4d-d7ff-407e-bf27-ec6402f786a0\") " pod="kube-system/global-pull-secret-syncer-mspjk" Apr 16 15:18:24.723228 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.723205 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mspjk" Apr 16 15:18:24.846739 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:24.846716 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mspjk"] Apr 16 15:18:24.849182 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:18:24.849154 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd685ae4d_d7ff_407e_bf27_ec6402f786a0.slice/crio-b324863d0ec393fe2a824953a2aac710e72211959d7b97a15a237c3552988d65 WatchSource:0}: Error finding container b324863d0ec393fe2a824953a2aac710e72211959d7b97a15a237c3552988d65: Status 404 returned error can't find the container with id b324863d0ec393fe2a824953a2aac710e72211959d7b97a15a237c3552988d65 Apr 16 15:18:25.212576 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:25.212542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mspjk" event={"ID":"d685ae4d-d7ff-407e-bf27-ec6402f786a0","Type":"ContainerStarted","Data":"b324863d0ec393fe2a824953a2aac710e72211959d7b97a15a237c3552988d65"} Apr 16 15:18:30.230122 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:30.230091 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mspjk" event={"ID":"d685ae4d-d7ff-407e-bf27-ec6402f786a0","Type":"ContainerStarted","Data":"6e89d08ee352f9138d034094a368aca85ffd688164abc65c87ce4a2aab21017b"} Apr 16 15:18:30.248955 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:30.248859 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mspjk" podStartSLOduration=1.014954375 podStartE2EDuration="6.248828515s" podCreationTimestamp="2026-04-16 15:18:24 +0000 UTC" firstStartedPulling="2026-04-16 15:18:24.850742079 +0000 UTC m=+383.436945869" lastFinishedPulling="2026-04-16 15:18:30.084616221 +0000 UTC m=+388.670820009" observedRunningTime="2026-04-16 15:18:30.248197297 +0000 UTC m=+388.834401109" watchObservedRunningTime="2026-04-16 15:18:30.248828515 +0000 UTC m=+388.835032324" Apr 16 15:18:53.895897 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:53.895866 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx"] Apr 16 15:18:53.898113 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:53.898098 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:18:53.901712 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:53.901692 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 15:18:53.901817 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:53.901803 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98fps\"" Apr 16 15:18:53.903057 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:53.903042 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 15:18:53.911442 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:53.911421 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx"] Apr 16 15:18:54.025958 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:54.025929 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:18:54.026126 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:54.025993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:18:54.026126 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:54.026069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkk4z\" (UniqueName: \"kubernetes.io/projected/41d0be64-2569-4872-9e89-a8f072745b11-kube-api-access-kkk4z\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:18:54.126767 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:54.126736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:18:54.126921 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:54.126810 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkk4z\" (UniqueName: \"kubernetes.io/projected/41d0be64-2569-4872-9e89-a8f072745b11-kube-api-access-kkk4z\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:18:54.126921 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:54.126846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:18:54.127103 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:54.127082 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:18:54.127167 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:54.127140 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:18:54.137902 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:54.137870 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkk4z\" 
(UniqueName: \"kubernetes.io/projected/41d0be64-2569-4872-9e89-a8f072745b11-kube-api-access-kkk4z\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:18:54.207470 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:54.207426 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:18:54.330109 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:54.330071 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx"] Apr 16 15:18:54.333666 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:18:54.333621 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41d0be64_2569_4872_9e89_a8f072745b11.slice/crio-6b0a92e05e227660daf5f1b2a030312a94bc37f4e812cdd19db1bea93c319acc WatchSource:0}: Error finding container 6b0a92e05e227660daf5f1b2a030312a94bc37f4e812cdd19db1bea93c319acc: Status 404 returned error can't find the container with id 6b0a92e05e227660daf5f1b2a030312a94bc37f4e812cdd19db1bea93c319acc Apr 16 15:18:55.309307 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:55.309249 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" event={"ID":"41d0be64-2569-4872-9e89-a8f072745b11","Type":"ContainerStarted","Data":"6b0a92e05e227660daf5f1b2a030312a94bc37f4e812cdd19db1bea93c319acc"} Apr 16 15:18:59.328973 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:59.328871 2573 generic.go:358] "Generic (PLEG): container finished" podID="41d0be64-2569-4872-9e89-a8f072745b11" containerID="d10c97076539a54e09617fb94a32cabd7a08afd83e11039e83278a044c850be1" 
exitCode=0 Apr 16 15:18:59.328973 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:18:59.328945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" event={"ID":"41d0be64-2569-4872-9e89-a8f072745b11","Type":"ContainerDied","Data":"d10c97076539a54e09617fb94a32cabd7a08afd83e11039e83278a044c850be1"} Apr 16 15:19:02.339904 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:02.339819 2573 generic.go:358] "Generic (PLEG): container finished" podID="41d0be64-2569-4872-9e89-a8f072745b11" containerID="10b349d71080e64cde730ee6205d1c5a64bfd1ab0c1a00bb7d8958cd1007ba3e" exitCode=0 Apr 16 15:19:02.340269 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:02.339904 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" event={"ID":"41d0be64-2569-4872-9e89-a8f072745b11","Type":"ContainerDied","Data":"10b349d71080e64cde730ee6205d1c5a64bfd1ab0c1a00bb7d8958cd1007ba3e"} Apr 16 15:19:09.363545 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:09.363509 2573 generic.go:358] "Generic (PLEG): container finished" podID="41d0be64-2569-4872-9e89-a8f072745b11" containerID="6913cc53caf6a3285dfbd3a79b8b50f4ba13b51f9c75a037f820dd32b7b3f9aa" exitCode=0 Apr 16 15:19:09.363928 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:09.363603 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" event={"ID":"41d0be64-2569-4872-9e89-a8f072745b11","Type":"ContainerDied","Data":"6913cc53caf6a3285dfbd3a79b8b50f4ba13b51f9c75a037f820dd32b7b3f9aa"} Apr 16 15:19:10.492010 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:10.491988 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:19:10.577478 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:10.577445 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkk4z\" (UniqueName: \"kubernetes.io/projected/41d0be64-2569-4872-9e89-a8f072745b11-kube-api-access-kkk4z\") pod \"41d0be64-2569-4872-9e89-a8f072745b11\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " Apr 16 15:19:10.577625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:10.577515 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-bundle\") pod \"41d0be64-2569-4872-9e89-a8f072745b11\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " Apr 16 15:19:10.577625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:10.577575 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-util\") pod \"41d0be64-2569-4872-9e89-a8f072745b11\" (UID: \"41d0be64-2569-4872-9e89-a8f072745b11\") " Apr 16 15:19:10.578136 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:10.578105 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-bundle" (OuterVolumeSpecName: "bundle") pod "41d0be64-2569-4872-9e89-a8f072745b11" (UID: "41d0be64-2569-4872-9e89-a8f072745b11"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:19:10.579816 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:10.579795 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d0be64-2569-4872-9e89-a8f072745b11-kube-api-access-kkk4z" (OuterVolumeSpecName: "kube-api-access-kkk4z") pod "41d0be64-2569-4872-9e89-a8f072745b11" (UID: "41d0be64-2569-4872-9e89-a8f072745b11"). InnerVolumeSpecName "kube-api-access-kkk4z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:19:10.581430 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:10.581398 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-util" (OuterVolumeSpecName: "util") pod "41d0be64-2569-4872-9e89-a8f072745b11" (UID: "41d0be64-2569-4872-9e89-a8f072745b11"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:19:10.678392 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:10.678367 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kkk4z\" (UniqueName: \"kubernetes.io/projected/41d0be64-2569-4872-9e89-a8f072745b11-kube-api-access-kkk4z\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:19:10.678392 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:10.678392 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:19:10.678585 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:10.678402 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41d0be64-2569-4872-9e89-a8f072745b11-util\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:19:11.376952 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:11.376915 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" event={"ID":"41d0be64-2569-4872-9e89-a8f072745b11","Type":"ContainerDied","Data":"6b0a92e05e227660daf5f1b2a030312a94bc37f4e812cdd19db1bea93c319acc"} Apr 16 15:19:11.376952 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:11.376953 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b0a92e05e227660daf5f1b2a030312a94bc37f4e812cdd19db1bea93c319acc" Apr 16 15:19:11.377149 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:11.376984 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kg8rx" Apr 16 15:19:16.341777 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.341741 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh"] Apr 16 15:19:16.342213 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.342058 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41d0be64-2569-4872-9e89-a8f072745b11" containerName="pull" Apr 16 15:19:16.342213 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.342068 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d0be64-2569-4872-9e89-a8f072745b11" containerName="pull" Apr 16 15:19:16.342213 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.342077 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41d0be64-2569-4872-9e89-a8f072745b11" containerName="extract" Apr 16 15:19:16.342213 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.342082 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d0be64-2569-4872-9e89-a8f072745b11" containerName="extract" Apr 16 15:19:16.342213 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.342096 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="41d0be64-2569-4872-9e89-a8f072745b11" containerName="util" Apr 16 15:19:16.342213 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.342102 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d0be64-2569-4872-9e89-a8f072745b11" containerName="util" Apr 16 15:19:16.342213 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.342152 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="41d0be64-2569-4872-9e89-a8f072745b11" containerName="extract" Apr 16 15:19:16.345304 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.345288 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh" Apr 16 15:19:16.348106 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.348082 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:19:16.348584 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.348566 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 15:19:16.348873 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.348857 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-mtqzg\"" Apr 16 15:19:16.357716 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.357695 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh"] Apr 16 15:19:16.528518 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.528482 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fb02d30-fe8d-4b5a-914e-13b4916bfbdb-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nl2sh\" (UID: 
\"0fb02d30-fe8d-4b5a-914e-13b4916bfbdb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh" Apr 16 15:19:16.528714 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.528539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnwhz\" (UniqueName: \"kubernetes.io/projected/0fb02d30-fe8d-4b5a-914e-13b4916bfbdb-kube-api-access-gnwhz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nl2sh\" (UID: \"0fb02d30-fe8d-4b5a-914e-13b4916bfbdb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh" Apr 16 15:19:16.629897 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.629809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fb02d30-fe8d-4b5a-914e-13b4916bfbdb-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nl2sh\" (UID: \"0fb02d30-fe8d-4b5a-914e-13b4916bfbdb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh" Apr 16 15:19:16.629897 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.629864 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnwhz\" (UniqueName: \"kubernetes.io/projected/0fb02d30-fe8d-4b5a-914e-13b4916bfbdb-kube-api-access-gnwhz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nl2sh\" (UID: \"0fb02d30-fe8d-4b5a-914e-13b4916bfbdb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh" Apr 16 15:19:16.630228 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.630203 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fb02d30-fe8d-4b5a-914e-13b4916bfbdb-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nl2sh\" (UID: \"0fb02d30-fe8d-4b5a-914e-13b4916bfbdb\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh" Apr 16 15:19:16.640153 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.640125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnwhz\" (UniqueName: \"kubernetes.io/projected/0fb02d30-fe8d-4b5a-914e-13b4916bfbdb-kube-api-access-gnwhz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nl2sh\" (UID: \"0fb02d30-fe8d-4b5a-914e-13b4916bfbdb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh" Apr 16 15:19:16.654901 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.654877 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh" Apr 16 15:19:16.800503 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:16.800479 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh"] Apr 16 15:19:16.803509 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:19:16.803477 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fb02d30_fe8d_4b5a_914e_13b4916bfbdb.slice/crio-02b392ef278a6559eb3775957625774b89a97530294d13c9f3a3f96e10247eef WatchSource:0}: Error finding container 02b392ef278a6559eb3775957625774b89a97530294d13c9f3a3f96e10247eef: Status 404 returned error can't find the container with id 02b392ef278a6559eb3775957625774b89a97530294d13c9f3a3f96e10247eef Apr 16 15:19:17.399019 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:17.398965 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh" event={"ID":"0fb02d30-fe8d-4b5a-914e-13b4916bfbdb","Type":"ContainerStarted","Data":"02b392ef278a6559eb3775957625774b89a97530294d13c9f3a3f96e10247eef"} Apr 16 15:19:19.408315 
ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:19.408285 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh" event={"ID":"0fb02d30-fe8d-4b5a-914e-13b4916bfbdb","Type":"ContainerStarted","Data":"d545c9e1c406dd96ca20ac9c3ec0e470ff9416a55551edb541f9586c57a0860b"} Apr 16 15:19:19.432031 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:19.431972 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nl2sh" podStartSLOduration=1.749694297 podStartE2EDuration="3.431956675s" podCreationTimestamp="2026-04-16 15:19:16 +0000 UTC" firstStartedPulling="2026-04-16 15:19:16.805932942 +0000 UTC m=+435.392136732" lastFinishedPulling="2026-04-16 15:19:18.488195323 +0000 UTC m=+437.074399110" observedRunningTime="2026-04-16 15:19:19.429292485 +0000 UTC m=+438.015496292" watchObservedRunningTime="2026-04-16 15:19:19.431956675 +0000 UTC m=+438.018160484" Apr 16 15:19:20.570989 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.570956 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw"] Apr 16 15:19:20.574398 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.574383 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:20.577366 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.577343 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 15:19:20.577777 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.577750 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 15:19:20.578739 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.578724 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98fps\"" Apr 16 15:19:20.584713 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.584694 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw"] Apr 16 15:19:20.661935 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.661904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:20.662093 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.661939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mgf2\" (UniqueName: \"kubernetes.io/projected/70b06ddc-369a-4221-9512-0144b1a96c9c-kube-api-access-7mgf2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:20.662093 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.661984 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:20.763346 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.763312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:20.763346 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.763351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mgf2\" (UniqueName: \"kubernetes.io/projected/70b06ddc-369a-4221-9512-0144b1a96c9c-kube-api-access-7mgf2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:20.763547 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.763375 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:20.763708 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.763686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:20.763772 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.763756 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:20.774285 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.774258 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mgf2\" (UniqueName: \"kubernetes.io/projected/70b06ddc-369a-4221-9512-0144b1a96c9c-kube-api-access-7mgf2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:20.883947 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:20.883865 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:21.026227 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:21.026204 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw"] Apr 16 15:19:21.028097 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:19:21.028067 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70b06ddc_369a_4221_9512_0144b1a96c9c.slice/crio-c07c629df55912adb9e77bad963905cd52beb58c1ac676fa5086a46e6873891e WatchSource:0}: Error finding container c07c629df55912adb9e77bad963905cd52beb58c1ac676fa5086a46e6873891e: Status 404 returned error can't find the container with id c07c629df55912adb9e77bad963905cd52beb58c1ac676fa5086a46e6873891e Apr 16 15:19:21.416596 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:21.416561 2573 generic.go:358] "Generic (PLEG): container finished" podID="70b06ddc-369a-4221-9512-0144b1a96c9c" containerID="0e5b9eaf796848ce8bfa1cf5a5998a57f73892e8d3160c0712a092bd9b346d7f" exitCode=0 Apr 16 15:19:21.416816 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:21.416673 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" event={"ID":"70b06ddc-369a-4221-9512-0144b1a96c9c","Type":"ContainerDied","Data":"0e5b9eaf796848ce8bfa1cf5a5998a57f73892e8d3160c0712a092bd9b346d7f"} Apr 16 15:19:21.416816 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:21.416710 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" event={"ID":"70b06ddc-369a-4221-9512-0144b1a96c9c","Type":"ContainerStarted","Data":"c07c629df55912adb9e77bad963905cd52beb58c1ac676fa5086a46e6873891e"} Apr 16 15:19:23.594363 ip-10-0-136-151 kubenswrapper[2573]: 
I0416 15:19:23.594329 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-zdxd8"] Apr 16 15:19:23.597586 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:23.597569 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-zdxd8" Apr 16 15:19:23.612987 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:23.612963 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 15:19:23.613420 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:23.613407 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 15:19:23.641906 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:23.641878 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-zdxd8"] Apr 16 15:19:23.684219 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:23.684183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24bt7\" (UniqueName: \"kubernetes.io/projected/bc6aaf20-00c3-4808-a3de-1e1b56fae4cf-kube-api-access-24bt7\") pod \"cert-manager-cainjector-8966b78d4-zdxd8\" (UID: \"bc6aaf20-00c3-4808-a3de-1e1b56fae4cf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zdxd8" Apr 16 15:19:23.684219 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:23.684221 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc6aaf20-00c3-4808-a3de-1e1b56fae4cf-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-zdxd8\" (UID: \"bc6aaf20-00c3-4808-a3de-1e1b56fae4cf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zdxd8" Apr 16 15:19:23.784937 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:23.784903 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-24bt7\" (UniqueName: \"kubernetes.io/projected/bc6aaf20-00c3-4808-a3de-1e1b56fae4cf-kube-api-access-24bt7\") pod \"cert-manager-cainjector-8966b78d4-zdxd8\" (UID: \"bc6aaf20-00c3-4808-a3de-1e1b56fae4cf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zdxd8" Apr 16 15:19:23.784937 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:23.784939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc6aaf20-00c3-4808-a3de-1e1b56fae4cf-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-zdxd8\" (UID: \"bc6aaf20-00c3-4808-a3de-1e1b56fae4cf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zdxd8" Apr 16 15:19:23.797431 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:23.797396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc6aaf20-00c3-4808-a3de-1e1b56fae4cf-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-zdxd8\" (UID: \"bc6aaf20-00c3-4808-a3de-1e1b56fae4cf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zdxd8" Apr 16 15:19:23.798266 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:23.798243 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24bt7\" (UniqueName: \"kubernetes.io/projected/bc6aaf20-00c3-4808-a3de-1e1b56fae4cf-kube-api-access-24bt7\") pod \"cert-manager-cainjector-8966b78d4-zdxd8\" (UID: \"bc6aaf20-00c3-4808-a3de-1e1b56fae4cf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zdxd8" Apr 16 15:19:23.918420 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:23.918387 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-zdxd8" Apr 16 15:19:24.051713 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:24.051690 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-zdxd8"] Apr 16 15:19:24.053682 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:19:24.053628 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc6aaf20_00c3_4808_a3de_1e1b56fae4cf.slice/crio-66f20afe0680259f7c160d714055d43dd200b4c1885bae61760db673911c1380 WatchSource:0}: Error finding container 66f20afe0680259f7c160d714055d43dd200b4c1885bae61760db673911c1380: Status 404 returned error can't find the container with id 66f20afe0680259f7c160d714055d43dd200b4c1885bae61760db673911c1380 Apr 16 15:19:24.428314 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:24.428269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-zdxd8" event={"ID":"bc6aaf20-00c3-4808-a3de-1e1b56fae4cf","Type":"ContainerStarted","Data":"66f20afe0680259f7c160d714055d43dd200b4c1885bae61760db673911c1380"} Apr 16 15:19:27.440709 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:27.440673 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-zdxd8" event={"ID":"bc6aaf20-00c3-4808-a3de-1e1b56fae4cf","Type":"ContainerStarted","Data":"5026d3f29092decf3230841ef659fe02ced46351c0a225a90b545199c6e3f144"} Apr 16 15:19:27.466538 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:27.466481 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-zdxd8" podStartSLOduration=1.4200231429999999 podStartE2EDuration="4.466467469s" podCreationTimestamp="2026-04-16 15:19:23 +0000 UTC" firstStartedPulling="2026-04-16 15:19:24.055438598 +0000 UTC m=+442.641642403" lastFinishedPulling="2026-04-16 
15:19:27.101882942 +0000 UTC m=+445.688086729" observedRunningTime="2026-04-16 15:19:27.465335164 +0000 UTC m=+446.051538973" watchObservedRunningTime="2026-04-16 15:19:27.466467469 +0000 UTC m=+446.052671277" Apr 16 15:19:31.455967 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:31.455938 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" event={"ID":"70b06ddc-369a-4221-9512-0144b1a96c9c","Type":"ContainerStarted","Data":"d1cc980ec75f048b2bb52ec7ce17fa3f056393ac70499e4e01ab8436642a314a"} Apr 16 15:19:32.460909 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:32.460873 2573 generic.go:358] "Generic (PLEG): container finished" podID="70b06ddc-369a-4221-9512-0144b1a96c9c" containerID="d1cc980ec75f048b2bb52ec7ce17fa3f056393ac70499e4e01ab8436642a314a" exitCode=0 Apr 16 15:19:32.461371 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:32.460927 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" event={"ID":"70b06ddc-369a-4221-9512-0144b1a96c9c","Type":"ContainerDied","Data":"d1cc980ec75f048b2bb52ec7ce17fa3f056393ac70499e4e01ab8436642a314a"} Apr 16 15:19:33.466762 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:33.466729 2573 generic.go:358] "Generic (PLEG): container finished" podID="70b06ddc-369a-4221-9512-0144b1a96c9c" containerID="d6db6c64724c0eea464dd3a44420bc840c4f97d110d9d874c2360ac459d8630a" exitCode=0 Apr 16 15:19:33.467115 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:33.466807 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" event={"ID":"70b06ddc-369a-4221-9512-0144b1a96c9c","Type":"ContainerDied","Data":"d6db6c64724c0eea464dd3a44420bc840c4f97d110d9d874c2360ac459d8630a"} Apr 16 15:19:34.591269 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:34.591242 2573 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" Apr 16 15:19:34.677034 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:34.677001 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-util\") pod \"70b06ddc-369a-4221-9512-0144b1a96c9c\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " Apr 16 15:19:34.677198 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:34.677088 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mgf2\" (UniqueName: \"kubernetes.io/projected/70b06ddc-369a-4221-9512-0144b1a96c9c-kube-api-access-7mgf2\") pod \"70b06ddc-369a-4221-9512-0144b1a96c9c\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " Apr 16 15:19:34.677198 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:34.677123 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-bundle\") pod \"70b06ddc-369a-4221-9512-0144b1a96c9c\" (UID: \"70b06ddc-369a-4221-9512-0144b1a96c9c\") " Apr 16 15:19:34.677552 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:34.677527 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-bundle" (OuterVolumeSpecName: "bundle") pod "70b06ddc-369a-4221-9512-0144b1a96c9c" (UID: "70b06ddc-369a-4221-9512-0144b1a96c9c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:19:34.679310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:34.679290 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70b06ddc-369a-4221-9512-0144b1a96c9c-kube-api-access-7mgf2" (OuterVolumeSpecName: "kube-api-access-7mgf2") pod "70b06ddc-369a-4221-9512-0144b1a96c9c" (UID: "70b06ddc-369a-4221-9512-0144b1a96c9c"). InnerVolumeSpecName "kube-api-access-7mgf2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:19:34.682728 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:34.682700 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-util" (OuterVolumeSpecName: "util") pod "70b06ddc-369a-4221-9512-0144b1a96c9c" (UID: "70b06ddc-369a-4221-9512-0144b1a96c9c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:19:34.778403 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:34.778312 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7mgf2\" (UniqueName: \"kubernetes.io/projected/70b06ddc-369a-4221-9512-0144b1a96c9c-kube-api-access-7mgf2\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:19:34.778403 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:34.778348 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:19:34.778403 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:34.778361 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70b06ddc-369a-4221-9512-0144b1a96c9c-util\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:19:35.475887 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:35.475837 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw" event={"ID":"70b06ddc-369a-4221-9512-0144b1a96c9c","Type":"ContainerDied","Data":"c07c629df55912adb9e77bad963905cd52beb58c1ac676fa5086a46e6873891e"}
Apr 16 15:19:35.475887 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:35.475887 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c07c629df55912adb9e77bad963905cd52beb58c1ac676fa5086a46e6873891e"
Apr 16 15:19:35.475887 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:35.475889 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpgfgw"
Apr 16 15:19:47.029234 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.029187 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"]
Apr 16 15:19:47.029994 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.029971 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70b06ddc-369a-4221-9512-0144b1a96c9c" containerName="pull"
Apr 16 15:19:47.030077 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.029999 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b06ddc-369a-4221-9512-0144b1a96c9c" containerName="pull"
Apr 16 15:19:47.030077 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.030040 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70b06ddc-369a-4221-9512-0144b1a96c9c" containerName="extract"
Apr 16 15:19:47.030077 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.030055 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b06ddc-369a-4221-9512-0144b1a96c9c" containerName="extract"
Apr 16 15:19:47.030210 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.030088 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70b06ddc-369a-4221-9512-0144b1a96c9c" containerName="util"
Apr 16 15:19:47.030210 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.030096 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b06ddc-369a-4221-9512-0144b1a96c9c" containerName="util"
Apr 16 15:19:47.030299 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.030243 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="70b06ddc-369a-4221-9512-0144b1a96c9c" containerName="extract"
Apr 16 15:19:47.034670 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.034628 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:47.038013 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.037992 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98fps\""
Apr 16 15:19:47.038537 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.038520 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 15:19:47.039177 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.039160 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 15:19:47.042753 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.042635 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"]
Apr 16 15:19:47.187489 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.187450 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:47.187712 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.187521 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:47.187712 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.187549 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rwz5\" (UniqueName: \"kubernetes.io/projected/10b54af7-3ace-4d52-b9b0-044a659ba4f6-kube-api-access-6rwz5\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:47.288994 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.288910 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:47.288994 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.288948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rwz5\" (UniqueName: \"kubernetes.io/projected/10b54af7-3ace-4d52-b9b0-044a659ba4f6-kube-api-access-6rwz5\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:47.289186 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.289003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:47.289372 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.289350 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:47.289441 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.289360 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:47.298886 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.298859 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rwz5\" (UniqueName: \"kubernetes.io/projected/10b54af7-3ace-4d52-b9b0-044a659ba4f6-kube-api-access-6rwz5\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:47.345621 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.345594 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:47.469604 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.469580 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"]
Apr 16 15:19:47.471868 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:19:47.471835 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b54af7_3ace_4d52_b9b0_044a659ba4f6.slice/crio-a28daf012685c4e68296b0b1064db1ac5043642aab045b0f91bfd5f99c0ca179 WatchSource:0}: Error finding container a28daf012685c4e68296b0b1064db1ac5043642aab045b0f91bfd5f99c0ca179: Status 404 returned error can't find the container with id a28daf012685c4e68296b0b1064db1ac5043642aab045b0f91bfd5f99c0ca179
Apr 16 15:19:47.516258 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:47.516230 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7" event={"ID":"10b54af7-3ace-4d52-b9b0-044a659ba4f6","Type":"ContainerStarted","Data":"a28daf012685c4e68296b0b1064db1ac5043642aab045b0f91bfd5f99c0ca179"}
Apr 16 15:19:48.521043 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:48.521008 2573 generic.go:358] "Generic (PLEG): container finished" podID="10b54af7-3ace-4d52-b9b0-044a659ba4f6" containerID="7985bcfe319ab355744bee87c3e82cb25ebff54a8677763cbcd0b20cd2111533" exitCode=0
Apr 16 15:19:48.521398 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:48.521091 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7" event={"ID":"10b54af7-3ace-4d52-b9b0-044a659ba4f6","Type":"ContainerDied","Data":"7985bcfe319ab355744bee87c3e82cb25ebff54a8677763cbcd0b20cd2111533"}
Apr 16 15:19:49.525887 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:49.525853 2573 generic.go:358] "Generic (PLEG): container finished" podID="10b54af7-3ace-4d52-b9b0-044a659ba4f6" containerID="e4d71d407936639d22f7db9f4772c8d8441335f29af3b77a67130f847c2a67d2" exitCode=0
Apr 16 15:19:49.526250 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:49.525940 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7" event={"ID":"10b54af7-3ace-4d52-b9b0-044a659ba4f6","Type":"ContainerDied","Data":"e4d71d407936639d22f7db9f4772c8d8441335f29af3b77a67130f847c2a67d2"}
Apr 16 15:19:50.531383 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:50.531349 2573 generic.go:358] "Generic (PLEG): container finished" podID="10b54af7-3ace-4d52-b9b0-044a659ba4f6" containerID="623da920ed9e0acb1e7a6d0bbd904a0eecf0ead43c257172e60adf92283b2ef4" exitCode=0
Apr 16 15:19:50.531813 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:50.531406 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7" event={"ID":"10b54af7-3ace-4d52-b9b0-044a659ba4f6","Type":"ContainerDied","Data":"623da920ed9e0acb1e7a6d0bbd904a0eecf0ead43c257172e60adf92283b2ef4"}
Apr 16 15:19:51.658488 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:51.658468 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:51.729606 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:51.729572 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-util\") pod \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") "
Apr 16 15:19:51.729789 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:51.729642 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rwz5\" (UniqueName: \"kubernetes.io/projected/10b54af7-3ace-4d52-b9b0-044a659ba4f6-kube-api-access-6rwz5\") pod \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") "
Apr 16 15:19:51.729789 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:51.729749 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-bundle\") pod \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\" (UID: \"10b54af7-3ace-4d52-b9b0-044a659ba4f6\") "
Apr 16 15:19:51.730555 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:51.730523 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-bundle" (OuterVolumeSpecName: "bundle") pod "10b54af7-3ace-4d52-b9b0-044a659ba4f6" (UID: "10b54af7-3ace-4d52-b9b0-044a659ba4f6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:19:51.731882 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:51.731849 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b54af7-3ace-4d52-b9b0-044a659ba4f6-kube-api-access-6rwz5" (OuterVolumeSpecName: "kube-api-access-6rwz5") pod "10b54af7-3ace-4d52-b9b0-044a659ba4f6" (UID: "10b54af7-3ace-4d52-b9b0-044a659ba4f6"). InnerVolumeSpecName "kube-api-access-6rwz5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:19:51.735761 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:51.735736 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-util" (OuterVolumeSpecName: "util") pod "10b54af7-3ace-4d52-b9b0-044a659ba4f6" (UID: "10b54af7-3ace-4d52-b9b0-044a659ba4f6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:19:51.831339 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:51.831272 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:19:51.831339 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:51.831297 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/10b54af7-3ace-4d52-b9b0-044a659ba4f6-util\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:19:51.831339 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:51.831308 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rwz5\" (UniqueName: \"kubernetes.io/projected/10b54af7-3ace-4d52-b9b0-044a659ba4f6-kube-api-access-6rwz5\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:19:52.540070 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:52.540036 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7" event={"ID":"10b54af7-3ace-4d52-b9b0-044a659ba4f6","Type":"ContainerDied","Data":"a28daf012685c4e68296b0b1064db1ac5043642aab045b0f91bfd5f99c0ca179"}
Apr 16 15:19:52.540070 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:52.540065 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55cmd7"
Apr 16 15:19:52.540279 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:19:52.540071 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a28daf012685c4e68296b0b1064db1ac5043642aab045b0f91bfd5f99c0ca179"
Apr 16 15:20:00.983454 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:00.983415 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"]
Apr 16 15:20:00.983866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:00.983757 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b54af7-3ace-4d52-b9b0-044a659ba4f6" containerName="extract"
Apr 16 15:20:00.983866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:00.983768 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b54af7-3ace-4d52-b9b0-044a659ba4f6" containerName="extract"
Apr 16 15:20:00.983866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:00.983778 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b54af7-3ace-4d52-b9b0-044a659ba4f6" containerName="util"
Apr 16 15:20:00.983866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:00.983783 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b54af7-3ace-4d52-b9b0-044a659ba4f6" containerName="util"
Apr 16 15:20:00.983866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:00.983804 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b54af7-3ace-4d52-b9b0-044a659ba4f6" containerName="pull"
Apr 16 15:20:00.983866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:00.983810 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b54af7-3ace-4d52-b9b0-044a659ba4f6" containerName="pull"
Apr 16 15:20:00.983866 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:00.983857 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="10b54af7-3ace-4d52-b9b0-044a659ba4f6" containerName="extract"
Apr 16 15:20:00.990881 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:00.990855 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"
Apr 16 15:20:00.999854 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:00.999824 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 15:20:01.000773 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.000749 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 15:20:01.001004 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.000863 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98fps\""
Apr 16 15:20:01.002120 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.002096 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"]
Apr 16 15:20:01.110791 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.110753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"
Apr 16 15:20:01.110993 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.110802 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"
Apr 16 15:20:01.110993 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.110946 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwncl\" (UniqueName: \"kubernetes.io/projected/e9414917-f119-49c7-9c34-9726119aef26-kube-api-access-zwncl\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"
Apr 16 15:20:01.211615 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.211573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwncl\" (UniqueName: \"kubernetes.io/projected/e9414917-f119-49c7-9c34-9726119aef26-kube-api-access-zwncl\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"
Apr 16 15:20:01.211856 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.211624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"
Apr 16 15:20:01.211856 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.211693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"
Apr 16 15:20:01.212019 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.211996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"
Apr 16 15:20:01.212081 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.212046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"
Apr 16 15:20:01.225318 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.225295 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwncl\" (UniqueName: \"kubernetes.io/projected/e9414917-f119-49c7-9c34-9726119aef26-kube-api-access-zwncl\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"
Apr 16 15:20:01.300295 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.300198 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"
Apr 16 15:20:01.437230 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.437193 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8"]
Apr 16 15:20:01.439290 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:20:01.439262 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9414917_f119_49c7_9c34_9726119aef26.slice/crio-777be7bdd1ac2dfa050c6e49b94d181e94ba235c8248c864972b2073d1d5abb4 WatchSource:0}: Error finding container 777be7bdd1ac2dfa050c6e49b94d181e94ba235c8248c864972b2073d1d5abb4: Status 404 returned error can't find the container with id 777be7bdd1ac2dfa050c6e49b94d181e94ba235c8248c864972b2073d1d5abb4
Apr 16 15:20:01.570546 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.570476 2573 generic.go:358] "Generic (PLEG): container finished" podID="e9414917-f119-49c7-9c34-9726119aef26" containerID="25adf8067bf2cdc5064b80942b86eb5d2de5f2bd8af8dfbce35c70539cea2e7f" exitCode=0
Apr 16 15:20:01.570678 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.570543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8" event={"ID":"e9414917-f119-49c7-9c34-9726119aef26","Type":"ContainerDied","Data":"25adf8067bf2cdc5064b80942b86eb5d2de5f2bd8af8dfbce35c70539cea2e7f"}
Apr 16 15:20:01.570678 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:01.570577 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8" event={"ID":"e9414917-f119-49c7-9c34-9726119aef26","Type":"ContainerStarted","Data":"777be7bdd1ac2dfa050c6e49b94d181e94ba235c8248c864972b2073d1d5abb4"}
Apr 16 15:20:02.582337 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:02.582304 2573 generic.go:358] "Generic (PLEG): container finished" podID="e9414917-f119-49c7-9c34-9726119aef26" containerID="0bafbf32da65dc241f849b4c3fae83b311b9151313d27d600307ce0d59ca9328" exitCode=0
Apr 16 15:20:02.582814 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:02.582397 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8" event={"ID":"e9414917-f119-49c7-9c34-9726119aef26","Type":"ContainerDied","Data":"0bafbf32da65dc241f849b4c3fae83b311b9151313d27d600307ce0d59ca9328"}
Apr 16 15:20:03.183686 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.183637 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"]
Apr 16 15:20:03.198710 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.198678 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"
Apr 16 15:20:03.202505 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.202475 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"]
Apr 16 15:20:03.202744 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.202720 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-tfg64\""
Apr 16 15:20:03.202826 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.202785 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 15:20:03.202900 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.202732 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 15:20:03.202985 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.202970 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 15:20:03.203315 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.203295 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 15:20:03.326334 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.326296 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jtr7\" (UniqueName: \"kubernetes.io/projected/426be61e-17e1-4f01-b19f-3ab462b45ca7-kube-api-access-8jtr7\") pod \"opendatahub-operator-controller-manager-68df4b58f7-fsbq9\" (UID: \"426be61e-17e1-4f01-b19f-3ab462b45ca7\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"
Apr 16 15:20:03.326334 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.326336 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/426be61e-17e1-4f01-b19f-3ab462b45ca7-webhook-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-fsbq9\" (UID: \"426be61e-17e1-4f01-b19f-3ab462b45ca7\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"
Apr 16 15:20:03.326540 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.326366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/426be61e-17e1-4f01-b19f-3ab462b45ca7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-fsbq9\" (UID: \"426be61e-17e1-4f01-b19f-3ab462b45ca7\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"
Apr 16 15:20:03.427852 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.427806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jtr7\" (UniqueName: \"kubernetes.io/projected/426be61e-17e1-4f01-b19f-3ab462b45ca7-kube-api-access-8jtr7\") pod \"opendatahub-operator-controller-manager-68df4b58f7-fsbq9\" (UID: \"426be61e-17e1-4f01-b19f-3ab462b45ca7\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"
Apr 16 15:20:03.427852 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.427856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/426be61e-17e1-4f01-b19f-3ab462b45ca7-webhook-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-fsbq9\" (UID: \"426be61e-17e1-4f01-b19f-3ab462b45ca7\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"
Apr 16 15:20:03.428058 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.427891 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/426be61e-17e1-4f01-b19f-3ab462b45ca7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-fsbq9\" (UID: \"426be61e-17e1-4f01-b19f-3ab462b45ca7\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"
Apr 16 15:20:03.430549 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.430524 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/426be61e-17e1-4f01-b19f-3ab462b45ca7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-fsbq9\" (UID: \"426be61e-17e1-4f01-b19f-3ab462b45ca7\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"
Apr 16 15:20:03.430691 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.430668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/426be61e-17e1-4f01-b19f-3ab462b45ca7-webhook-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-fsbq9\" (UID: \"426be61e-17e1-4f01-b19f-3ab462b45ca7\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"
Apr 16 15:20:03.437623 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.437543 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jtr7\" (UniqueName: \"kubernetes.io/projected/426be61e-17e1-4f01-b19f-3ab462b45ca7-kube-api-access-8jtr7\") pod \"opendatahub-operator-controller-manager-68df4b58f7-fsbq9\" (UID: \"426be61e-17e1-4f01-b19f-3ab462b45ca7\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"
Apr 16 15:20:03.517802 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.517776 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"
Apr 16 15:20:03.588236 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.588202 2573 generic.go:358] "Generic (PLEG): container finished" podID="e9414917-f119-49c7-9c34-9726119aef26" containerID="78b951bda0b644e5881cca1b3015fe6ac9fba04ebd4bfeb5ad96796a7aa47606" exitCode=0
Apr 16 15:20:03.588575 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.588286 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8" event={"ID":"e9414917-f119-49c7-9c34-9726119aef26","Type":"ContainerDied","Data":"78b951bda0b644e5881cca1b3015fe6ac9fba04ebd4bfeb5ad96796a7aa47606"}
Apr 16 15:20:03.665667 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.665607 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9"]
Apr 16 15:20:03.670912 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:20:03.670876 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod426be61e_17e1_4f01_b19f_3ab462b45ca7.slice/crio-630fa6eabe4b29aa0214cfb7e9237cfa83d443912e9a1decaa8b0b3013bb3a5f WatchSource:0}: Error finding container 630fa6eabe4b29aa0214cfb7e9237cfa83d443912e9a1decaa8b0b3013bb3a5f: Status 404 returned error can't find the container with id 630fa6eabe4b29aa0214cfb7e9237cfa83d443912e9a1decaa8b0b3013bb3a5f
Apr 16 15:20:03.795296 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.795223 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"]
Apr 16 15:20:03.800007 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.799976 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"
Apr 16 15:20:03.804249 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.804227 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 15:20:03.804385 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.804294 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 15:20:03.804385 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.804295 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 15:20:03.804516 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.804501 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 15:20:03.804714 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.804698 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-gqln7\""
Apr 16 15:20:03.804810 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.804727 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 15:20:03.811758 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.811734 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"]
Apr 16 15:20:03.932980 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.932940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c8cc64e-869f-4398-8880-ccca3d33543f-cert\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"
Apr 16 15:20:03.932980 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.932985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7twst\" (UniqueName: \"kubernetes.io/projected/2c8cc64e-869f-4398-8880-ccca3d33543f-kube-api-access-7twst\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"
Apr 16 15:20:03.933188 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.933032 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c8cc64e-869f-4398-8880-ccca3d33543f-metrics-cert\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"
Apr 16 15:20:03.933188 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:03.933079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2c8cc64e-869f-4398-8880-ccca3d33543f-manager-config\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"
Apr 16 15:20:04.034243 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.034206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c8cc64e-869f-4398-8880-ccca3d33543f-cert\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"
Apr 16 15:20:04.034243 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.034244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7twst\" (UniqueName: \"kubernetes.io/projected/2c8cc64e-869f-4398-8880-ccca3d33543f-kube-api-access-7twst\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"
Apr 16 15:20:04.034441 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.034274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c8cc64e-869f-4398-8880-ccca3d33543f-metrics-cert\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"
Apr 16 15:20:04.034441 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.034303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2c8cc64e-869f-4398-8880-ccca3d33543f-manager-config\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"
Apr 16 15:20:04.035019 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.034991 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2c8cc64e-869f-4398-8880-ccca3d33543f-manager-config\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"
Apr 16 15:20:04.036930 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.036905 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c8cc64e-869f-4398-8880-ccca3d33543f-cert\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\")
" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765" Apr 16 15:20:04.037032 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.036951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c8cc64e-869f-4398-8880-ccca3d33543f-metrics-cert\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765" Apr 16 15:20:04.054281 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.054225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7twst\" (UniqueName: \"kubernetes.io/projected/2c8cc64e-869f-4398-8880-ccca3d33543f-kube-api-access-7twst\") pod \"lws-controller-manager-6fc585dfcd-2r765\" (UID: \"2c8cc64e-869f-4398-8880-ccca3d33543f\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765" Apr 16 15:20:04.110071 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.110016 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765" Apr 16 15:20:04.265679 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.265633 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765"] Apr 16 15:20:04.268011 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:20:04.267977 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c8cc64e_869f_4398_8880_ccca3d33543f.slice/crio-fced3f36ba2c8e6a7db27ed69c726edb16f5a78e3f8b1f1e46ddb74b5fe85486 WatchSource:0}: Error finding container fced3f36ba2c8e6a7db27ed69c726edb16f5a78e3f8b1f1e46ddb74b5fe85486: Status 404 returned error can't find the container with id fced3f36ba2c8e6a7db27ed69c726edb16f5a78e3f8b1f1e46ddb74b5fe85486 Apr 16 15:20:04.594103 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.594063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765" event={"ID":"2c8cc64e-869f-4398-8880-ccca3d33543f","Type":"ContainerStarted","Data":"fced3f36ba2c8e6a7db27ed69c726edb16f5a78e3f8b1f1e46ddb74b5fe85486"} Apr 16 15:20:04.596336 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.596275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9" event={"ID":"426be61e-17e1-4f01-b19f-3ab462b45ca7","Type":"ContainerStarted","Data":"630fa6eabe4b29aa0214cfb7e9237cfa83d443912e9a1decaa8b0b3013bb3a5f"} Apr 16 15:20:04.767423 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.766799 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8" Apr 16 15:20:04.944206 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.944168 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwncl\" (UniqueName: \"kubernetes.io/projected/e9414917-f119-49c7-9c34-9726119aef26-kube-api-access-zwncl\") pod \"e9414917-f119-49c7-9c34-9726119aef26\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " Apr 16 15:20:04.944394 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.944257 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-bundle\") pod \"e9414917-f119-49c7-9c34-9726119aef26\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " Apr 16 15:20:04.944394 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.944311 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-util\") pod \"e9414917-f119-49c7-9c34-9726119aef26\" (UID: \"e9414917-f119-49c7-9c34-9726119aef26\") " Apr 16 15:20:04.945618 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.945586 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-bundle" (OuterVolumeSpecName: "bundle") pod "e9414917-f119-49c7-9c34-9726119aef26" (UID: "e9414917-f119-49c7-9c34-9726119aef26"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:20:04.947097 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.947063 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9414917-f119-49c7-9c34-9726119aef26-kube-api-access-zwncl" (OuterVolumeSpecName: "kube-api-access-zwncl") pod "e9414917-f119-49c7-9c34-9726119aef26" (UID: "e9414917-f119-49c7-9c34-9726119aef26"). InnerVolumeSpecName "kube-api-access-zwncl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:20:04.951960 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:04.951929 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-util" (OuterVolumeSpecName: "util") pod "e9414917-f119-49c7-9c34-9726119aef26" (UID: "e9414917-f119-49c7-9c34-9726119aef26"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:20:05.045544 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:05.045500 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-util\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:20:05.045544 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:05.045538 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zwncl\" (UniqueName: \"kubernetes.io/projected/e9414917-f119-49c7-9c34-9726119aef26-kube-api-access-zwncl\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:20:05.045765 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:05.045554 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9414917-f119-49c7-9c34-9726119aef26-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:20:05.601643 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:05.601610 2573 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8" Apr 16 15:20:05.602076 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:05.601616 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c995qb8" event={"ID":"e9414917-f119-49c7-9c34-9726119aef26","Type":"ContainerDied","Data":"777be7bdd1ac2dfa050c6e49b94d181e94ba235c8248c864972b2073d1d5abb4"} Apr 16 15:20:05.602076 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:05.601706 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="777be7bdd1ac2dfa050c6e49b94d181e94ba235c8248c864972b2073d1d5abb4" Apr 16 15:20:06.607100 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:06.607007 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9" event={"ID":"426be61e-17e1-4f01-b19f-3ab462b45ca7","Type":"ContainerStarted","Data":"ee9de7bef4d4a3b72f93abbb58bc7abc9ea25db8f85b074fd108eea8a7dd8288"} Apr 16 15:20:06.607532 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:06.607229 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9" Apr 16 15:20:06.629782 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:06.629724 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9" podStartSLOduration=1.102345224 podStartE2EDuration="3.629707938s" podCreationTimestamp="2026-04-16 15:20:03 +0000 UTC" firstStartedPulling="2026-04-16 15:20:03.672780139 +0000 UTC m=+482.258983931" lastFinishedPulling="2026-04-16 15:20:06.200142855 +0000 UTC m=+484.786346645" observedRunningTime="2026-04-16 15:20:06.627089098 +0000 UTC m=+485.213292906" watchObservedRunningTime="2026-04-16 15:20:06.629707938 
+0000 UTC m=+485.215911749" Apr 16 15:20:07.613258 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:07.613210 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765" event={"ID":"2c8cc64e-869f-4398-8880-ccca3d33543f","Type":"ContainerStarted","Data":"3dd67d6b54d760d8aba2062561a62ebed985e871fa2cf8bb36ceb4c7fc405811"} Apr 16 15:20:07.613719 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:07.613434 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765" Apr 16 15:20:07.634493 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:07.634449 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765" podStartSLOduration=1.844760172 podStartE2EDuration="4.634438739s" podCreationTimestamp="2026-04-16 15:20:03 +0000 UTC" firstStartedPulling="2026-04-16 15:20:04.270747397 +0000 UTC m=+482.856951196" lastFinishedPulling="2026-04-16 15:20:07.060425976 +0000 UTC m=+485.646629763" observedRunningTime="2026-04-16 15:20:07.631722005 +0000 UTC m=+486.217925816" watchObservedRunningTime="2026-04-16 15:20:07.634438739 +0000 UTC m=+486.220642547" Apr 16 15:20:17.615714 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:17.615682 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-fsbq9" Apr 16 15:20:18.620207 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:18.620176 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-2r765" Apr 16 15:20:23.405502 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.405468 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k"] Apr 16 15:20:23.405908 
ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.405816 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9414917-f119-49c7-9c34-9726119aef26" containerName="extract" Apr 16 15:20:23.405908 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.405827 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9414917-f119-49c7-9c34-9726119aef26" containerName="extract" Apr 16 15:20:23.405908 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.405843 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9414917-f119-49c7-9c34-9726119aef26" containerName="pull" Apr 16 15:20:23.405908 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.405849 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9414917-f119-49c7-9c34-9726119aef26" containerName="pull" Apr 16 15:20:23.405908 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.405866 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9414917-f119-49c7-9c34-9726119aef26" containerName="util" Apr 16 15:20:23.405908 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.405872 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9414917-f119-49c7-9c34-9726119aef26" containerName="util" Apr 16 15:20:23.406091 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.405923 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9414917-f119-49c7-9c34-9726119aef26" containerName="extract" Apr 16 15:20:23.410525 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.410495 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:23.413863 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.413840 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98fps\"" Apr 16 15:20:23.413979 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.413843 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 15:20:23.415317 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.415297 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 15:20:23.423320 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.423300 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k"] Apr 16 15:20:23.491782 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.491746 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zp6j\" (UniqueName: \"kubernetes.io/projected/9d72ff65-9208-4cbd-82a4-2e821f1303a7-kube-api-access-7zp6j\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:23.491943 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.491800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:23.491943 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.491887 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:23.592875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.592840 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zp6j\" (UniqueName: \"kubernetes.io/projected/9d72ff65-9208-4cbd-82a4-2e821f1303a7-kube-api-access-7zp6j\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:23.592875 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.592881 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:23.593126 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.592926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:23.593273 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.593256 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:23.593326 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.593283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:23.602467 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.602447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zp6j\" (UniqueName: \"kubernetes.io/projected/9d72ff65-9208-4cbd-82a4-2e821f1303a7-kube-api-access-7zp6j\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:23.720957 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.720871 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:23.848270 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:23.848244 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k"] Apr 16 15:20:23.849685 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:20:23.849643 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d72ff65_9208_4cbd_82a4_2e821f1303a7.slice/crio-da1a445ea7067527187cb163e49a145ca8c1f9b051c769eda1fc112f24bf8451 WatchSource:0}: Error finding container da1a445ea7067527187cb163e49a145ca8c1f9b051c769eda1fc112f24bf8451: Status 404 returned error can't find the container with id da1a445ea7067527187cb163e49a145ca8c1f9b051c769eda1fc112f24bf8451 Apr 16 15:20:24.673956 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:24.673920 2573 generic.go:358] "Generic (PLEG): container finished" podID="9d72ff65-9208-4cbd-82a4-2e821f1303a7" containerID="f10b913a865dc3771887f7f2caa8ef791ee968a8889cca45cbf77fb776890e6d" exitCode=0 Apr 16 15:20:24.674286 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:24.674010 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" event={"ID":"9d72ff65-9208-4cbd-82a4-2e821f1303a7","Type":"ContainerDied","Data":"f10b913a865dc3771887f7f2caa8ef791ee968a8889cca45cbf77fb776890e6d"} Apr 16 15:20:24.674286 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:24.674041 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" event={"ID":"9d72ff65-9208-4cbd-82a4-2e821f1303a7","Type":"ContainerStarted","Data":"da1a445ea7067527187cb163e49a145ca8c1f9b051c769eda1fc112f24bf8451"} Apr 16 15:20:24.843309 ip-10-0-136-151 kubenswrapper[2573]: 
I0416 15:20:24.843266 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9"] Apr 16 15:20:24.846780 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:24.846757 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" Apr 16 15:20:24.852249 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:24.852227 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 15:20:24.852382 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:24.852281 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-p8mts\"" Apr 16 15:20:24.852382 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:24.852345 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 15:20:24.860836 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:24.860785 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9"] Apr 16 15:20:24.905569 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:24.905545 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b1f0daf-7085-452a-bcf8-246adaa9a7b1-tls-certs\") pod \"kube-auth-proxy-5bb547c98c-s85c9\" (UID: \"6b1f0daf-7085-452a-bcf8-246adaa9a7b1\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" Apr 16 15:20:24.905706 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:24.905577 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b1f0daf-7085-452a-bcf8-246adaa9a7b1-tmp\") pod \"kube-auth-proxy-5bb547c98c-s85c9\" (UID: \"6b1f0daf-7085-452a-bcf8-246adaa9a7b1\") " 
pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" Apr 16 15:20:24.905750 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:24.905722 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bx77\" (UniqueName: \"kubernetes.io/projected/6b1f0daf-7085-452a-bcf8-246adaa9a7b1-kube-api-access-9bx77\") pod \"kube-auth-proxy-5bb547c98c-s85c9\" (UID: \"6b1f0daf-7085-452a-bcf8-246adaa9a7b1\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" Apr 16 15:20:25.006357 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:25.006320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b1f0daf-7085-452a-bcf8-246adaa9a7b1-tmp\") pod \"kube-auth-proxy-5bb547c98c-s85c9\" (UID: \"6b1f0daf-7085-452a-bcf8-246adaa9a7b1\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" Apr 16 15:20:25.006549 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:25.006396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bx77\" (UniqueName: \"kubernetes.io/projected/6b1f0daf-7085-452a-bcf8-246adaa9a7b1-kube-api-access-9bx77\") pod \"kube-auth-proxy-5bb547c98c-s85c9\" (UID: \"6b1f0daf-7085-452a-bcf8-246adaa9a7b1\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" Apr 16 15:20:25.006549 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:25.006426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b1f0daf-7085-452a-bcf8-246adaa9a7b1-tls-certs\") pod \"kube-auth-proxy-5bb547c98c-s85c9\" (UID: \"6b1f0daf-7085-452a-bcf8-246adaa9a7b1\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" Apr 16 15:20:25.008893 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:25.008871 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/6b1f0daf-7085-452a-bcf8-246adaa9a7b1-tmp\") pod \"kube-auth-proxy-5bb547c98c-s85c9\" (UID: \"6b1f0daf-7085-452a-bcf8-246adaa9a7b1\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" Apr 16 15:20:25.009118 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:25.009098 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b1f0daf-7085-452a-bcf8-246adaa9a7b1-tls-certs\") pod \"kube-auth-proxy-5bb547c98c-s85c9\" (UID: \"6b1f0daf-7085-452a-bcf8-246adaa9a7b1\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" Apr 16 15:20:25.018490 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:25.018467 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bx77\" (UniqueName: \"kubernetes.io/projected/6b1f0daf-7085-452a-bcf8-246adaa9a7b1-kube-api-access-9bx77\") pod \"kube-auth-proxy-5bb547c98c-s85c9\" (UID: \"6b1f0daf-7085-452a-bcf8-246adaa9a7b1\") " pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" Apr 16 15:20:25.156515 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:25.156487 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" Apr 16 15:20:25.275471 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:25.275447 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9"] Apr 16 15:20:25.277005 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:20:25.276979 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b1f0daf_7085_452a_bcf8_246adaa9a7b1.slice/crio-b9a3636475dab6557fb71a3b3f8eab13867838e0ef196723ea5c3454a36bb58c WatchSource:0}: Error finding container b9a3636475dab6557fb71a3b3f8eab13867838e0ef196723ea5c3454a36bb58c: Status 404 returned error can't find the container with id b9a3636475dab6557fb71a3b3f8eab13867838e0ef196723ea5c3454a36bb58c Apr 16 15:20:25.681353 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:25.681314 2573 generic.go:358] "Generic (PLEG): container finished" podID="9d72ff65-9208-4cbd-82a4-2e821f1303a7" containerID="965d843fa2aa410c72e3dd83858807b3d1908a7c516b859dbabef5db3a593460" exitCode=0 Apr 16 15:20:25.681802 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:25.681418 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" event={"ID":"9d72ff65-9208-4cbd-82a4-2e821f1303a7","Type":"ContainerDied","Data":"965d843fa2aa410c72e3dd83858807b3d1908a7c516b859dbabef5db3a593460"} Apr 16 15:20:25.682804 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:25.682776 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" event={"ID":"6b1f0daf-7085-452a-bcf8-246adaa9a7b1","Type":"ContainerStarted","Data":"b9a3636475dab6557fb71a3b3f8eab13867838e0ef196723ea5c3454a36bb58c"} Apr 16 15:20:26.689325 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:26.689286 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="9d72ff65-9208-4cbd-82a4-2e821f1303a7" containerID="5767f7826c7744e6179ef3862d61bef3f93aecbc159e4149060058afa484f2d7" exitCode=0 Apr 16 15:20:26.689821 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:26.689337 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" event={"ID":"9d72ff65-9208-4cbd-82a4-2e821f1303a7","Type":"ContainerDied","Data":"5767f7826c7744e6179ef3862d61bef3f93aecbc159e4149060058afa484f2d7"} Apr 16 15:20:27.816450 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:27.816425 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:27.933780 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:27.933738 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-bundle\") pod \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " Apr 16 15:20:27.933945 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:27.933795 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zp6j\" (UniqueName: \"kubernetes.io/projected/9d72ff65-9208-4cbd-82a4-2e821f1303a7-kube-api-access-7zp6j\") pod \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " Apr 16 15:20:27.933945 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:27.933823 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-util\") pod \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\" (UID: \"9d72ff65-9208-4cbd-82a4-2e821f1303a7\") " Apr 16 15:20:27.934639 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:27.934608 2573 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-bundle" (OuterVolumeSpecName: "bundle") pod "9d72ff65-9208-4cbd-82a4-2e821f1303a7" (UID: "9d72ff65-9208-4cbd-82a4-2e821f1303a7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:20:27.936041 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:27.936013 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d72ff65-9208-4cbd-82a4-2e821f1303a7-kube-api-access-7zp6j" (OuterVolumeSpecName: "kube-api-access-7zp6j") pod "9d72ff65-9208-4cbd-82a4-2e821f1303a7" (UID: "9d72ff65-9208-4cbd-82a4-2e821f1303a7"). InnerVolumeSpecName "kube-api-access-7zp6j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:20:27.938910 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:27.938885 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-util" (OuterVolumeSpecName: "util") pod "9d72ff65-9208-4cbd-82a4-2e821f1303a7" (UID: "9d72ff65-9208-4cbd-82a4-2e821f1303a7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:20:28.034635 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:28.034594 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:20:28.034635 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:28.034619 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zp6j\" (UniqueName: \"kubernetes.io/projected/9d72ff65-9208-4cbd-82a4-2e821f1303a7-kube-api-access-7zp6j\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:20:28.034635 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:28.034628 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d72ff65-9208-4cbd-82a4-2e821f1303a7-util\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:20:28.698430 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:28.698386 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" event={"ID":"9d72ff65-9208-4cbd-82a4-2e821f1303a7","Type":"ContainerDied","Data":"da1a445ea7067527187cb163e49a145ca8c1f9b051c769eda1fc112f24bf8451"} Apr 16 15:20:28.698430 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:28.698423 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483597h5k" Apr 16 15:20:28.698743 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:28.698430 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da1a445ea7067527187cb163e49a145ca8c1f9b051c769eda1fc112f24bf8451" Apr 16 15:20:29.703003 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:29.702968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" event={"ID":"6b1f0daf-7085-452a-bcf8-246adaa9a7b1","Type":"ContainerStarted","Data":"a1d45760d72c4a83368cd26b601c69be04e24a70cb3d3050a95270a45991a002"} Apr 16 15:20:29.720938 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:29.720869 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5bb547c98c-s85c9" podStartSLOduration=2.021168175 podStartE2EDuration="5.720856071s" podCreationTimestamp="2026-04-16 15:20:24 +0000 UTC" firstStartedPulling="2026-04-16 15:20:25.278677347 +0000 UTC m=+503.864881135" lastFinishedPulling="2026-04-16 15:20:28.978365243 +0000 UTC m=+507.564569031" observedRunningTime="2026-04-16 15:20:29.719458516 +0000 UTC m=+508.305662324" watchObservedRunningTime="2026-04-16 15:20:29.720856071 +0000 UTC m=+508.307059879" Apr 16 15:20:37.843032 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.842942 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs"] Apr 16 15:20:37.843584 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.843469 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d72ff65-9208-4cbd-82a4-2e821f1303a7" containerName="pull" Apr 16 15:20:37.843584 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.843487 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d72ff65-9208-4cbd-82a4-2e821f1303a7" 
containerName="pull" Apr 16 15:20:37.843584 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.843516 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d72ff65-9208-4cbd-82a4-2e821f1303a7" containerName="util" Apr 16 15:20:37.843584 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.843524 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d72ff65-9208-4cbd-82a4-2e821f1303a7" containerName="util" Apr 16 15:20:37.843584 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.843542 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d72ff65-9208-4cbd-82a4-2e821f1303a7" containerName="extract" Apr 16 15:20:37.843584 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.843552 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d72ff65-9208-4cbd-82a4-2e821f1303a7" containerName="extract" Apr 16 15:20:37.843940 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.843637 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d72ff65-9208-4cbd-82a4-2e821f1303a7" containerName="extract" Apr 16 15:20:37.847624 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.847602 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:37.850835 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.850812 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98fps\"" Apr 16 15:20:37.852030 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.852011 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 15:20:37.852087 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.852011 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 15:20:37.892276 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.892248 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs"] Apr 16 15:20:37.928712 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.928672 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:37.928876 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.928727 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7pvx\" (UniqueName: \"kubernetes.io/projected/3024ce39-9506-4dbf-b893-e5b4edc7802e-kube-api-access-l7pvx\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:37.928876 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:37.928792 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:38.029361 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:38.029323 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:38.029546 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:38.029374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7pvx\" (UniqueName: \"kubernetes.io/projected/3024ce39-9506-4dbf-b893-e5b4edc7802e-kube-api-access-l7pvx\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:38.029546 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:38.029456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:38.029726 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:38.029703 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:38.029837 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:38.029819 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:38.052551 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:38.052522 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7pvx\" (UniqueName: \"kubernetes.io/projected/3024ce39-9506-4dbf-b893-e5b4edc7802e-kube-api-access-l7pvx\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:38.156985 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:38.156958 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:38.369501 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:38.369472 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs"] Apr 16 15:20:38.373376 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:20:38.373350 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3024ce39_9506_4dbf_b893_e5b4edc7802e.slice/crio-172ef6aa9105735606f7dc70563728651cfc9690f7074763cc542122ecd4033e WatchSource:0}: Error finding container 172ef6aa9105735606f7dc70563728651cfc9690f7074763cc542122ecd4033e: Status 404 returned error can't find the container with id 172ef6aa9105735606f7dc70563728651cfc9690f7074763cc542122ecd4033e Apr 16 15:20:38.737640 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:38.737543 2573 generic.go:358] "Generic (PLEG): container finished" podID="3024ce39-9506-4dbf-b893-e5b4edc7802e" containerID="9125dbe9438210131fbc4a567d40adac275b5d7ba1a39bc12c78a4cbc2c81e8f" exitCode=0 Apr 16 15:20:38.737824 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:38.737634 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" event={"ID":"3024ce39-9506-4dbf-b893-e5b4edc7802e","Type":"ContainerDied","Data":"9125dbe9438210131fbc4a567d40adac275b5d7ba1a39bc12c78a4cbc2c81e8f"} Apr 16 15:20:38.737824 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:38.737694 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" event={"ID":"3024ce39-9506-4dbf-b893-e5b4edc7802e","Type":"ContainerStarted","Data":"172ef6aa9105735606f7dc70563728651cfc9690f7074763cc542122ecd4033e"} Apr 16 15:20:39.744440 ip-10-0-136-151 kubenswrapper[2573]: 
I0416 15:20:39.744401 2573 generic.go:358] "Generic (PLEG): container finished" podID="3024ce39-9506-4dbf-b893-e5b4edc7802e" containerID="bc0394ae015a253ca5e66426c60ee8ddf7570eb6899c4c2f6b50dcfe0d343978" exitCode=0 Apr 16 15:20:39.744834 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:39.744488 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" event={"ID":"3024ce39-9506-4dbf-b893-e5b4edc7802e","Type":"ContainerDied","Data":"bc0394ae015a253ca5e66426c60ee8ddf7570eb6899c4c2f6b50dcfe0d343978"} Apr 16 15:20:40.752150 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:40.752115 2573 generic.go:358] "Generic (PLEG): container finished" podID="3024ce39-9506-4dbf-b893-e5b4edc7802e" containerID="c7fcdac2826a58fce42d3fb19dba42ecd82db9ee6a0910d5a2af8e1253f3e5ba" exitCode=0 Apr 16 15:20:40.752542 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:40.752197 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" event={"ID":"3024ce39-9506-4dbf-b893-e5b4edc7802e","Type":"ContainerDied","Data":"c7fcdac2826a58fce42d3fb19dba42ecd82db9ee6a0910d5a2af8e1253f3e5ba"} Apr 16 15:20:41.882258 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:41.882228 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:20:41.960419 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:41.960393 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-bundle\") pod \"3024ce39-9506-4dbf-b893-e5b4edc7802e\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " Apr 16 15:20:41.960569 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:41.960496 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7pvx\" (UniqueName: \"kubernetes.io/projected/3024ce39-9506-4dbf-b893-e5b4edc7802e-kube-api-access-l7pvx\") pod \"3024ce39-9506-4dbf-b893-e5b4edc7802e\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " Apr 16 15:20:41.960569 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:41.960554 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-util\") pod \"3024ce39-9506-4dbf-b893-e5b4edc7802e\" (UID: \"3024ce39-9506-4dbf-b893-e5b4edc7802e\") " Apr 16 15:20:41.961376 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:41.961340 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-bundle" (OuterVolumeSpecName: "bundle") pod "3024ce39-9506-4dbf-b893-e5b4edc7802e" (UID: "3024ce39-9506-4dbf-b893-e5b4edc7802e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:20:41.962785 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:41.962760 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3024ce39-9506-4dbf-b893-e5b4edc7802e-kube-api-access-l7pvx" (OuterVolumeSpecName: "kube-api-access-l7pvx") pod "3024ce39-9506-4dbf-b893-e5b4edc7802e" (UID: "3024ce39-9506-4dbf-b893-e5b4edc7802e"). InnerVolumeSpecName "kube-api-access-l7pvx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:20:41.966378 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:41.966357 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-util" (OuterVolumeSpecName: "util") pod "3024ce39-9506-4dbf-b893-e5b4edc7802e" (UID: "3024ce39-9506-4dbf-b893-e5b4edc7802e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:20:42.062039 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:42.062005 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-util\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:20:42.062039 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:42.062037 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3024ce39-9506-4dbf-b893-e5b4edc7802e-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:20:42.062231 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:42.062048 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l7pvx\" (UniqueName: \"kubernetes.io/projected/3024ce39-9506-4dbf-b893-e5b4edc7802e-kube-api-access-l7pvx\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:20:42.760920 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:42.760885 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" event={"ID":"3024ce39-9506-4dbf-b893-e5b4edc7802e","Type":"ContainerDied","Data":"172ef6aa9105735606f7dc70563728651cfc9690f7074763cc542122ecd4033e"} Apr 16 15:20:42.760920 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:42.760917 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="172ef6aa9105735606f7dc70563728651cfc9690f7074763cc542122ecd4033e" Apr 16 15:20:42.761123 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:20:42.760953 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mmvrs" Apr 16 15:21:20.402097 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.402067 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bcb57dd94-r6xk2"] Apr 16 15:21:20.402564 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.402426 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3024ce39-9506-4dbf-b893-e5b4edc7802e" containerName="extract" Apr 16 15:21:20.402564 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.402438 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3024ce39-9506-4dbf-b893-e5b4edc7802e" containerName="extract" Apr 16 15:21:20.402564 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.402451 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3024ce39-9506-4dbf-b893-e5b4edc7802e" containerName="pull" Apr 16 15:21:20.402564 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.402457 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3024ce39-9506-4dbf-b893-e5b4edc7802e" containerName="pull" Apr 16 15:21:20.402564 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.402471 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="3024ce39-9506-4dbf-b893-e5b4edc7802e" containerName="util" Apr 16 15:21:20.402564 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.402477 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3024ce39-9506-4dbf-b893-e5b4edc7802e" containerName="util" Apr 16 15:21:20.402564 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.402531 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3024ce39-9506-4dbf-b893-e5b4edc7802e" containerName="extract" Apr 16 15:21:20.406034 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.405995 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.416234 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.416210 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bcb57dd94-r6xk2"] Apr 16 15:21:20.484833 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.484796 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-console-serving-cert\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.484833 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.484833 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-oauth-serving-cert\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.485079 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.484858 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-service-ca\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.485079 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.484935 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-console-oauth-config\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.485079 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.485006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzfk7\" (UniqueName: \"kubernetes.io/projected/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-kube-api-access-kzfk7\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.485079 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.485063 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-console-config\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.485223 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.485096 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-trusted-ca-bundle\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.586211 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:21:20.586173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-console-config\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.586211 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.586211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-trusted-ca-bundle\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.586440 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.586348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-console-serving-cert\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.586440 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.586394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-oauth-serving-cert\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.586603 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.586440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-service-ca\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " 
pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.586603 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.586479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-console-oauth-config\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.586603 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.586541 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzfk7\" (UniqueName: \"kubernetes.io/projected/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-kube-api-access-kzfk7\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.586985 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.586961 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-console-config\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.587101 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.587070 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-trusted-ca-bundle\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.587155 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.587128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-oauth-serving-cert\") pod 
\"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.587427 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.587362 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-service-ca\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.588937 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.588912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-console-serving-cert\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.589134 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.589117 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-console-oauth-config\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.594923 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.594903 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzfk7\" (UniqueName: \"kubernetes.io/projected/d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c-kube-api-access-kzfk7\") pod \"console-5bcb57dd94-r6xk2\" (UID: \"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c\") " pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.718582 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.718499 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5bcb57dd94-r6xk2" Apr 16 15:21:20.848835 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.848813 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bcb57dd94-r6xk2"] Apr 16 15:21:20.851575 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:21:20.851550 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8d2ac3f_dd82_4da5_8eae_3d9fe0a7d38c.slice/crio-e17e6b4559f18f6d4e1bb7ffa5a865d9faa713a10064e9ce3ea8b677c23d0307 WatchSource:0}: Error finding container e17e6b4559f18f6d4e1bb7ffa5a865d9faa713a10064e9ce3ea8b677c23d0307: Status 404 returned error can't find the container with id e17e6b4559f18f6d4e1bb7ffa5a865d9faa713a10064e9ce3ea8b677c23d0307 Apr 16 15:21:20.895124 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:20.895096 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcb57dd94-r6xk2" event={"ID":"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c","Type":"ContainerStarted","Data":"e17e6b4559f18f6d4e1bb7ffa5a865d9faa713a10064e9ce3ea8b677c23d0307"} Apr 16 15:21:21.899886 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:21.899845 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcb57dd94-r6xk2" event={"ID":"d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c","Type":"ContainerStarted","Data":"1881c9595138a259388d9634773a2908381a1af6dc6141ed7e35a339dd0eb8b8"} Apr 16 15:21:21.919194 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:21.919152 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bcb57dd94-r6xk2" podStartSLOduration=1.919136064 podStartE2EDuration="1.919136064s" podCreationTimestamp="2026-04-16 15:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:21:21.918432701 +0000 UTC 
m=+560.504636520" watchObservedRunningTime="2026-04-16 15:21:21.919136064 +0000 UTC m=+560.505339876"
Apr 16 15:21:30.718625 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:30.718577 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5bcb57dd94-r6xk2"
Apr 16 15:21:30.719096 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:30.718902 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bcb57dd94-r6xk2"
Apr 16 15:21:30.723417 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:30.723391 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bcb57dd94-r6xk2"
Apr 16 15:21:30.946350 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:30.946322 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bcb57dd94-r6xk2"
Apr 16 15:21:30.996003 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:30.995822 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bbf4d9849-clvm8"]
Apr 16 15:21:37.156053 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.156019 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"]
Apr 16 15:21:37.160859 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.160842 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:37.164032 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.164011 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 15:21:37.165398 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.165372 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 15:21:37.165523 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.165427 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-mjgck\""
Apr 16 15:21:37.168986 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.168866 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"]
Apr 16 15:21:37.240487 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.240444 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4f8c\" (UniqueName: \"kubernetes.io/projected/5935dea1-ae73-40c9-98ae-ed32f97ce643-kube-api-access-k4f8c\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:37.240639 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.240588 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:37.240639 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.240617 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:37.341904 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.341868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4f8c\" (UniqueName: \"kubernetes.io/projected/5935dea1-ae73-40c9-98ae-ed32f97ce643-kube-api-access-k4f8c\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:37.342044 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.341948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:37.342044 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.341968 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:37.342339 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.342323 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:37.342379 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.342356 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:37.350345 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.350315 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4f8c\" (UniqueName: \"kubernetes.io/projected/5935dea1-ae73-40c9-98ae-ed32f97ce643-kube-api-access-k4f8c\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:37.471697 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.471596 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:37.621612 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.621584 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"]
Apr 16 15:21:37.622889 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:21:37.622860 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5935dea1_ae73_40c9_98ae_ed32f97ce643.slice/crio-9006e34ed18f511497c2242d03d30aeac22c57f922670a4e49f2851c7b530f1c WatchSource:0}: Error finding container 9006e34ed18f511497c2242d03d30aeac22c57f922670a4e49f2851c7b530f1c: Status 404 returned error can't find the container with id 9006e34ed18f511497c2242d03d30aeac22c57f922670a4e49f2851c7b530f1c
Apr 16 15:21:37.763377 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.763346 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"]
Apr 16 15:21:37.766867 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.766837 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:37.776237 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.776211 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"]
Apr 16 15:21:37.850091 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.850063 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:37.850270 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.850102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqstt\" (UniqueName: \"kubernetes.io/projected/5c7b8a7a-8603-481a-b226-086efa4a6d4f-kube-api-access-lqstt\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:37.850270 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.850212 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:37.951205 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.951168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:37.951382 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.951217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:37.951382 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.951240 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqstt\" (UniqueName: \"kubernetes.io/projected/5c7b8a7a-8603-481a-b226-086efa4a6d4f-kube-api-access-lqstt\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:37.951569 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.951544 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:37.951633 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.951605 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:37.960247 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.960228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqstt\" (UniqueName: \"kubernetes.io/projected/5c7b8a7a-8603-481a-b226-086efa4a6d4f-kube-api-access-lqstt\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:37.968080 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.968052 2573 generic.go:358] "Generic (PLEG): container finished" podID="5935dea1-ae73-40c9-98ae-ed32f97ce643" containerID="4a2130327a67316eabc5bc108b29ce4247e32da5af0ccf953f0a1d96ef005137" exitCode=0
Apr 16 15:21:37.968187 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.968140 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w" event={"ID":"5935dea1-ae73-40c9-98ae-ed32f97ce643","Type":"ContainerDied","Data":"4a2130327a67316eabc5bc108b29ce4247e32da5af0ccf953f0a1d96ef005137"}
Apr 16 15:21:37.968187 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:37.968181 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w" event={"ID":"5935dea1-ae73-40c9-98ae-ed32f97ce643","Type":"ContainerStarted","Data":"9006e34ed18f511497c2242d03d30aeac22c57f922670a4e49f2851c7b530f1c"}
Apr 16 15:21:38.082894 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.082812 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:38.158325 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.158297 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"]
Apr 16 15:21:38.163897 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.163872 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:38.170793 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.170764 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"]
Apr 16 15:21:38.212676 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.212632 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"]
Apr 16 15:21:38.214409 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:21:38.214367 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c7b8a7a_8603_481a_b226_086efa4a6d4f.slice/crio-9677e2adabc8d3dd09fbd53066fb78f019be6530397b812f00d7b4f98052e176 WatchSource:0}: Error finding container 9677e2adabc8d3dd09fbd53066fb78f019be6530397b812f00d7b4f98052e176: Status 404 returned error can't find the container with id 9677e2adabc8d3dd09fbd53066fb78f019be6530397b812f00d7b4f98052e176
Apr 16 15:21:38.253593 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.253571 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjcsn\" (UniqueName: \"kubernetes.io/projected/3cd7de14-76d8-4975-a48c-583dc8ead026-kube-api-access-rjcsn\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:38.253704 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.253612 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:38.253753 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.253736 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:38.354669 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.354566 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:38.354812 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.354678 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:38.354812 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.354729 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjcsn\" (UniqueName: \"kubernetes.io/projected/3cd7de14-76d8-4975-a48c-583dc8ead026-kube-api-access-rjcsn\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:38.354959 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.354939 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:38.355031 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.355011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:38.364495 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.364472 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjcsn\" (UniqueName: \"kubernetes.io/projected/3cd7de14-76d8-4975-a48c-583dc8ead026-kube-api-access-rjcsn\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:38.477698 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.477668 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:38.558144 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.558112 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"]
Apr 16 15:21:38.563149 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.563118 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"
Apr 16 15:21:38.573414 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.573389 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"]
Apr 16 15:21:38.612208 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.612184 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"]
Apr 16 15:21:38.614045 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:21:38.614020 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cd7de14_76d8_4975_a48c_583dc8ead026.slice/crio-da727c7fb3d4e6189af1d534b5fdb56f87772894d20a5a845fb831abda29acba WatchSource:0}: Error finding container da727c7fb3d4e6189af1d534b5fdb56f87772894d20a5a845fb831abda29acba: Status 404 returned error can't find the container with id da727c7fb3d4e6189af1d534b5fdb56f87772894d20a5a845fb831abda29acba
Apr 16 15:21:38.658315 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.658288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"
Apr 16 15:21:38.658440 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.658334 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"
Apr 16 15:21:38.658440 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.658417 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7hn8\" (UniqueName: \"kubernetes.io/projected/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-kube-api-access-r7hn8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"
Apr 16 15:21:38.759643 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.759615 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"
Apr 16 15:21:38.759643 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.759675 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"
Apr 16 15:21:38.759908 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.759721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7hn8\" (UniqueName: \"kubernetes.io/projected/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-kube-api-access-r7hn8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"
Apr 16 15:21:38.759967 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.759952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"
Apr 16 15:21:38.760054 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.760035 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"
Apr 16 15:21:38.770444 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.770419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7hn8\" (UniqueName: \"kubernetes.io/projected/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-kube-api-access-r7hn8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"
Apr 16 15:21:38.875756 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.875735 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"
Apr 16 15:21:38.976972 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.976819 2573 generic.go:358] "Generic (PLEG): container finished" podID="3cd7de14-76d8-4975-a48c-583dc8ead026" containerID="453fa205f6daf3013760a745b147c57e8454c96c6ce2a0548c7ea79037664fbf" exitCode=0
Apr 16 15:21:38.976972 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.976943 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b" event={"ID":"3cd7de14-76d8-4975-a48c-583dc8ead026","Type":"ContainerDied","Data":"453fa205f6daf3013760a745b147c57e8454c96c6ce2a0548c7ea79037664fbf"}
Apr 16 15:21:38.976972 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.976971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b" event={"ID":"3cd7de14-76d8-4975-a48c-583dc8ead026","Type":"ContainerStarted","Data":"da727c7fb3d4e6189af1d534b5fdb56f87772894d20a5a845fb831abda29acba"}
Apr 16 15:21:38.979386 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.979337 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w" event={"ID":"5935dea1-ae73-40c9-98ae-ed32f97ce643","Type":"ContainerStarted","Data":"b4d77dbb9cd62069ff873bf951a72d9ab492aa92134e8f4dc45cb02415452a71"}
Apr 16 15:21:38.980967 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.980942 2573 generic.go:358] "Generic (PLEG): container finished" podID="5c7b8a7a-8603-481a-b226-086efa4a6d4f" containerID="e8d2685140691dd13ed746f94c790359c57dfd4021ef9fa81d2bd8b7f5277b54" exitCode=0
Apr 16 15:21:38.981062 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.981029 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw" event={"ID":"5c7b8a7a-8603-481a-b226-086efa4a6d4f","Type":"ContainerDied","Data":"e8d2685140691dd13ed746f94c790359c57dfd4021ef9fa81d2bd8b7f5277b54"}
Apr 16 15:21:38.981062 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:38.981058 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw" event={"ID":"5c7b8a7a-8603-481a-b226-086efa4a6d4f","Type":"ContainerStarted","Data":"9677e2adabc8d3dd09fbd53066fb78f019be6530397b812f00d7b4f98052e176"}
Apr 16 15:21:39.013017 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:39.012992 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw"]
Apr 16 15:21:39.034502 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:21:39.034471 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod844c3ca6_337b_4c63_b069_2bb7c48e6f4f.slice/crio-510858af7401e60e409a889ec7bf36d66547b65d23b2eb200d1dfe34c0a9aebe WatchSource:0}: Error finding container 510858af7401e60e409a889ec7bf36d66547b65d23b2eb200d1dfe34c0a9aebe: Status 404 returned error can't find the container with id 510858af7401e60e409a889ec7bf36d66547b65d23b2eb200d1dfe34c0a9aebe
Apr 16 15:21:39.986184 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:39.986091 2573 generic.go:358] "Generic (PLEG): container finished" podID="844c3ca6-337b-4c63-b069-2bb7c48e6f4f" containerID="3a953ebccf1dab4fb0ba1eab571341af48c4247f5d7eadd476815a0c49cc1323" exitCode=0
Apr 16 15:21:39.986594 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:39.986186 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw" event={"ID":"844c3ca6-337b-4c63-b069-2bb7c48e6f4f","Type":"ContainerDied","Data":"3a953ebccf1dab4fb0ba1eab571341af48c4247f5d7eadd476815a0c49cc1323"}
Apr 16 15:21:39.986594 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:39.986228 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw" event={"ID":"844c3ca6-337b-4c63-b069-2bb7c48e6f4f","Type":"ContainerStarted","Data":"510858af7401e60e409a889ec7bf36d66547b65d23b2eb200d1dfe34c0a9aebe"}
Apr 16 15:21:39.988039 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:39.988015 2573 generic.go:358] "Generic (PLEG): container finished" podID="3cd7de14-76d8-4975-a48c-583dc8ead026" containerID="8f3dedf4b6180edae2a6c2a411623c6a89b9688a6aa328dc51af8e41eb3909e0" exitCode=0
Apr 16 15:21:39.988138 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:39.988103 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b" event={"ID":"3cd7de14-76d8-4975-a48c-583dc8ead026","Type":"ContainerDied","Data":"8f3dedf4b6180edae2a6c2a411623c6a89b9688a6aa328dc51af8e41eb3909e0"}
Apr 16 15:21:39.989923 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:39.989898 2573 generic.go:358] "Generic (PLEG): container finished" podID="5935dea1-ae73-40c9-98ae-ed32f97ce643" containerID="b4d77dbb9cd62069ff873bf951a72d9ab492aa92134e8f4dc45cb02415452a71" exitCode=0
Apr 16 15:21:39.990040 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:39.989983 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w" event={"ID":"5935dea1-ae73-40c9-98ae-ed32f97ce643","Type":"ContainerDied","Data":"b4d77dbb9cd62069ff873bf951a72d9ab492aa92134e8f4dc45cb02415452a71"}
Apr 16 15:21:39.991908 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:39.991885 2573 generic.go:358] "Generic (PLEG): container finished" podID="5c7b8a7a-8603-481a-b226-086efa4a6d4f" containerID="e5d82bad55a37c8430ad931b59f837e1bb91a3daec5f21bc7c79a6293e104a4c" exitCode=0
Apr 16 15:21:39.992010 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:39.991950 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw" event={"ID":"5c7b8a7a-8603-481a-b226-086efa4a6d4f","Type":"ContainerDied","Data":"e5d82bad55a37c8430ad931b59f837e1bb91a3daec5f21bc7c79a6293e104a4c"}
Apr 16 15:21:40.998799 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:40.998764 2573 generic.go:358] "Generic (PLEG): container finished" podID="844c3ca6-337b-4c63-b069-2bb7c48e6f4f" containerID="dbb030b59805acffdcc4144826917076ee616387d31777e093bb32fe6baef981" exitCode=0
Apr 16 15:21:40.999184 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:40.998859 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw" event={"ID":"844c3ca6-337b-4c63-b069-2bb7c48e6f4f","Type":"ContainerDied","Data":"dbb030b59805acffdcc4144826917076ee616387d31777e093bb32fe6baef981"}
Apr 16 15:21:41.000870 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:41.000846 2573 generic.go:358] "Generic (PLEG): container finished" podID="3cd7de14-76d8-4975-a48c-583dc8ead026" containerID="19d5e7908bb25e2ed1ec8c62f655592fbc472cca700879307da402795b602c07" exitCode=0
Apr 16 15:21:41.000991 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:41.000970 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b" event={"ID":"3cd7de14-76d8-4975-a48c-583dc8ead026","Type":"ContainerDied","Data":"19d5e7908bb25e2ed1ec8c62f655592fbc472cca700879307da402795b602c07"}
Apr 16 15:21:41.002765 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:41.002743 2573 generic.go:358] "Generic (PLEG): container finished" podID="5935dea1-ae73-40c9-98ae-ed32f97ce643" containerID="b7594de4dafdef7401cdb65b1777b956adaeac5cdff617c35fef1b94183ddc67" exitCode=0
Apr 16 15:21:41.002869 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:41.002815 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w" event={"ID":"5935dea1-ae73-40c9-98ae-ed32f97ce643","Type":"ContainerDied","Data":"b7594de4dafdef7401cdb65b1777b956adaeac5cdff617c35fef1b94183ddc67"}
Apr 16 15:21:41.004482 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:41.004462 2573 generic.go:358] "Generic (PLEG): container finished" podID="5c7b8a7a-8603-481a-b226-086efa4a6d4f" containerID="f14e6cdeb7e0da18d10c29bc4832174a439d7385a83f811ccfb388610dba6916" exitCode=0
Apr 16 15:21:41.004557 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:41.004509 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw" event={"ID":"5c7b8a7a-8603-481a-b226-086efa4a6d4f","Type":"ContainerDied","Data":"f14e6cdeb7e0da18d10c29bc4832174a439d7385a83f811ccfb388610dba6916"}
Apr 16 15:21:42.011340 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.011302 2573 generic.go:358] "Generic (PLEG): container finished" podID="844c3ca6-337b-4c63-b069-2bb7c48e6f4f" containerID="f9996a28b0374345f41aafa976716575f108fa7b53b451b20c94049a15f42818" exitCode=0
Apr 16 15:21:42.011811 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.011415 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw" event={"ID":"844c3ca6-337b-4c63-b069-2bb7c48e6f4f","Type":"ContainerDied","Data":"f9996a28b0374345f41aafa976716575f108fa7b53b451b20c94049a15f42818"}
Apr 16 15:21:42.166435 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.166409 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw"
Apr 16 15:21:42.209812 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.209793 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b"
Apr 16 15:21:42.213060 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.213044 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w"
Apr 16 15:21:42.289672 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.289575 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-util\") pod \"5935dea1-ae73-40c9-98ae-ed32f97ce643\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") "
Apr 16 15:21:42.289672 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.289608 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-bundle\") pod \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") "
Apr 16 15:21:42.289672 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.289624 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-util\") pod \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") "
Apr 16 15:21:42.289990 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.289710 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4f8c\" (UniqueName: \"kubernetes.io/projected/5935dea1-ae73-40c9-98ae-ed32f97ce643-kube-api-access-k4f8c\") pod \"5935dea1-ae73-40c9-98ae-ed32f97ce643\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") "
Apr 16 15:21:42.289990 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.289795 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqstt\" (UniqueName: \"kubernetes.io/projected/5c7b8a7a-8603-481a-b226-086efa4a6d4f-kube-api-access-lqstt\") pod \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\" (UID: \"5c7b8a7a-8603-481a-b226-086efa4a6d4f\") "
Apr 16 15:21:42.289990 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.289828 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjcsn\" (UniqueName: \"kubernetes.io/projected/3cd7de14-76d8-4975-a48c-583dc8ead026-kube-api-access-rjcsn\") pod \"3cd7de14-76d8-4975-a48c-583dc8ead026\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") "
Apr 16 15:21:42.289990 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.289888 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-bundle\") pod \"5935dea1-ae73-40c9-98ae-ed32f97ce643\" (UID: \"5935dea1-ae73-40c9-98ae-ed32f97ce643\") "
Apr 16 15:21:42.289990 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.289961 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-bundle\") pod \"3cd7de14-76d8-4975-a48c-583dc8ead026\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") "
Apr 16 15:21:42.289990 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.289986 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-util\") pod \"3cd7de14-76d8-4975-a48c-583dc8ead026\" (UID: \"3cd7de14-76d8-4975-a48c-583dc8ead026\") "
Apr 16 15:21:42.290719 ip-10-0-136-151 kubenswrapper[2573]: I0416
15:21:42.290665 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-bundle" (OuterVolumeSpecName: "bundle") pod "5935dea1-ae73-40c9-98ae-ed32f97ce643" (UID: "5935dea1-ae73-40c9-98ae-ed32f97ce643"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:21:42.291108 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.291068 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-bundle" (OuterVolumeSpecName: "bundle") pod "3cd7de14-76d8-4975-a48c-583dc8ead026" (UID: "3cd7de14-76d8-4975-a48c-583dc8ead026"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:21:42.291426 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.291398 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-bundle" (OuterVolumeSpecName: "bundle") pod "5c7b8a7a-8603-481a-b226-086efa4a6d4f" (UID: "5c7b8a7a-8603-481a-b226-086efa4a6d4f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:21:42.292686 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.292636 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd7de14-76d8-4975-a48c-583dc8ead026-kube-api-access-rjcsn" (OuterVolumeSpecName: "kube-api-access-rjcsn") pod "3cd7de14-76d8-4975-a48c-583dc8ead026" (UID: "3cd7de14-76d8-4975-a48c-583dc8ead026"). InnerVolumeSpecName "kube-api-access-rjcsn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:21:42.292791 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.292713 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5935dea1-ae73-40c9-98ae-ed32f97ce643-kube-api-access-k4f8c" (OuterVolumeSpecName: "kube-api-access-k4f8c") pod "5935dea1-ae73-40c9-98ae-ed32f97ce643" (UID: "5935dea1-ae73-40c9-98ae-ed32f97ce643"). InnerVolumeSpecName "kube-api-access-k4f8c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:21:42.293220 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.293189 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7b8a7a-8603-481a-b226-086efa4a6d4f-kube-api-access-lqstt" (OuterVolumeSpecName: "kube-api-access-lqstt") pod "5c7b8a7a-8603-481a-b226-086efa4a6d4f" (UID: "5c7b8a7a-8603-481a-b226-086efa4a6d4f"). InnerVolumeSpecName "kube-api-access-lqstt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:21:42.296531 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.296509 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-util" (OuterVolumeSpecName: "util") pod "3cd7de14-76d8-4975-a48c-583dc8ead026" (UID: "3cd7de14-76d8-4975-a48c-583dc8ead026"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:21:42.296672 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.296633 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-util" (OuterVolumeSpecName: "util") pod "5935dea1-ae73-40c9-98ae-ed32f97ce643" (UID: "5935dea1-ae73-40c9-98ae-ed32f97ce643"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:21:42.296970 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.296952 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-util" (OuterVolumeSpecName: "util") pod "5c7b8a7a-8603-481a-b226-086efa4a6d4f" (UID: "5c7b8a7a-8603-481a-b226-086efa4a6d4f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:21:42.391523 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.391493 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k4f8c\" (UniqueName: \"kubernetes.io/projected/5935dea1-ae73-40c9-98ae-ed32f97ce643-kube-api-access-k4f8c\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:42.391523 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.391523 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lqstt\" (UniqueName: \"kubernetes.io/projected/5c7b8a7a-8603-481a-b226-086efa4a6d4f-kube-api-access-lqstt\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:42.391775 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.391533 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rjcsn\" (UniqueName: \"kubernetes.io/projected/3cd7de14-76d8-4975-a48c-583dc8ead026-kube-api-access-rjcsn\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:42.391775 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.391543 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:42.391775 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.391553 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-bundle\") 
on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:42.391775 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.391561 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cd7de14-76d8-4975-a48c-583dc8ead026-util\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:42.391775 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.391569 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5935dea1-ae73-40c9-98ae-ed32f97ce643-util\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:42.391775 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.391576 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:42.391775 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:42.391584 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c7b8a7a-8603-481a-b226-086efa4a6d4f-util\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:43.017308 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.017277 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b" Apr 16 15:21:43.017741 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.017278 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b" event={"ID":"3cd7de14-76d8-4975-a48c-583dc8ead026","Type":"ContainerDied","Data":"da727c7fb3d4e6189af1d534b5fdb56f87772894d20a5a845fb831abda29acba"} Apr 16 15:21:43.017741 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.017382 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da727c7fb3d4e6189af1d534b5fdb56f87772894d20a5a845fb831abda29acba" Apr 16 15:21:43.019054 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.019028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w" event={"ID":"5935dea1-ae73-40c9-98ae-ed32f97ce643","Type":"ContainerDied","Data":"9006e34ed18f511497c2242d03d30aeac22c57f922670a4e49f2851c7b530f1c"} Apr 16 15:21:43.019135 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.019061 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9006e34ed18f511497c2242d03d30aeac22c57f922670a4e49f2851c7b530f1c" Apr 16 15:21:43.019135 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.019062 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w" Apr 16 15:21:43.020728 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.020703 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw" event={"ID":"5c7b8a7a-8603-481a-b226-086efa4a6d4f","Type":"ContainerDied","Data":"9677e2adabc8d3dd09fbd53066fb78f019be6530397b812f00d7b4f98052e176"} Apr 16 15:21:43.020846 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.020739 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9677e2adabc8d3dd09fbd53066fb78f019be6530397b812f00d7b4f98052e176" Apr 16 15:21:43.020846 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.020712 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw" Apr 16 15:21:43.140187 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.140158 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw" Apr 16 15:21:43.299507 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.299422 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-bundle\") pod \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " Apr 16 15:21:43.299507 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.299492 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7hn8\" (UniqueName: \"kubernetes.io/projected/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-kube-api-access-r7hn8\") pod \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " Apr 16 15:21:43.299753 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.299541 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-util\") pod \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\" (UID: \"844c3ca6-337b-4c63-b069-2bb7c48e6f4f\") " Apr 16 15:21:43.299964 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.299937 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-bundle" (OuterVolumeSpecName: "bundle") pod "844c3ca6-337b-4c63-b069-2bb7c48e6f4f" (UID: "844c3ca6-337b-4c63-b069-2bb7c48e6f4f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:21:43.301888 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.301865 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-kube-api-access-r7hn8" (OuterVolumeSpecName: "kube-api-access-r7hn8") pod "844c3ca6-337b-4c63-b069-2bb7c48e6f4f" (UID: "844c3ca6-337b-4c63-b069-2bb7c48e6f4f"). InnerVolumeSpecName "kube-api-access-r7hn8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:21:43.306603 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.306575 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-util" (OuterVolumeSpecName: "util") pod "844c3ca6-337b-4c63-b069-2bb7c48e6f4f" (UID: "844c3ca6-337b-4c63-b069-2bb7c48e6f4f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:21:43.400941 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.400892 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-util\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:43.400941 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.400935 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:43.400941 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:43.400947 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r7hn8\" (UniqueName: \"kubernetes.io/projected/844c3ca6-337b-4c63-b069-2bb7c48e6f4f-kube-api-access-r7hn8\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:44.025995 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:44.025960 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw" event={"ID":"844c3ca6-337b-4c63-b069-2bb7c48e6f4f","Type":"ContainerDied","Data":"510858af7401e60e409a889ec7bf36d66547b65d23b2eb200d1dfe34c0a9aebe"} Apr 16 15:21:44.026339 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:44.026006 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="510858af7401e60e409a889ec7bf36d66547b65d23b2eb200d1dfe34c0a9aebe" Apr 16 15:21:44.026339 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:44.025973 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw" Apr 16 15:21:56.022479 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.022422 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6bbf4d9849-clvm8" podUID="a7635360-cbb1-4161-9e1f-d65f7cef6c61" containerName="console" containerID="cri-o://5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e" gracePeriod=15 Apr 16 15:21:56.270985 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.270965 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bbf4d9849-clvm8_a7635360-cbb1-4161-9e1f-d65f7cef6c61/console/0.log" Apr 16 15:21:56.271104 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.271025 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bbf4d9849-clvm8" Apr 16 15:21:56.313543 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.313483 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-trusted-ca-bundle\") pod \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " Apr 16 15:21:56.313543 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.313513 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-oauth-serving-cert\") pod \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " Apr 16 15:21:56.313721 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.313551 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-serving-cert\") pod \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " Apr 16 15:21:56.313721 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.313665 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-oauth-config\") pod \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " Apr 16 15:21:56.313804 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.313721 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-config\") pod \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " Apr 16 15:21:56.313804 
ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.313752 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-service-ca\") pod \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " Apr 16 15:21:56.314039 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.314008 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a7635360-cbb1-4161-9e1f-d65f7cef6c61" (UID: "a7635360-cbb1-4161-9e1f-d65f7cef6c61"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:21:56.314039 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.314015 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a7635360-cbb1-4161-9e1f-d65f7cef6c61" (UID: "a7635360-cbb1-4161-9e1f-d65f7cef6c61"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:21:56.314206 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.314102 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-config" (OuterVolumeSpecName: "console-config") pod "a7635360-cbb1-4161-9e1f-d65f7cef6c61" (UID: "a7635360-cbb1-4161-9e1f-d65f7cef6c61"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:21:56.314247 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.314193 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-service-ca" (OuterVolumeSpecName: "service-ca") pod "a7635360-cbb1-4161-9e1f-d65f7cef6c61" (UID: "a7635360-cbb1-4161-9e1f-d65f7cef6c61"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:21:56.315931 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.315907 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a7635360-cbb1-4161-9e1f-d65f7cef6c61" (UID: "a7635360-cbb1-4161-9e1f-d65f7cef6c61"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:21:56.316017 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.315988 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a7635360-cbb1-4161-9e1f-d65f7cef6c61" (UID: "a7635360-cbb1-4161-9e1f-d65f7cef6c61"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:21:56.414349 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.414316 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vvhf\" (UniqueName: \"kubernetes.io/projected/a7635360-cbb1-4161-9e1f-d65f7cef6c61-kube-api-access-9vvhf\") pod \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\" (UID: \"a7635360-cbb1-4161-9e1f-d65f7cef6c61\") " Apr 16 15:21:56.414530 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.414507 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-trusted-ca-bundle\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:56.414530 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.414519 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-oauth-serving-cert\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:56.414530 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.414529 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-serving-cert\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:56.414733 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.414538 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-oauth-config\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:56.414733 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.414546 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-console-config\") on node 
\"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:56.414733 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.414555 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7635360-cbb1-4161-9e1f-d65f7cef6c61-service-ca\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:56.416586 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.416560 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7635360-cbb1-4161-9e1f-d65f7cef6c61-kube-api-access-9vvhf" (OuterVolumeSpecName: "kube-api-access-9vvhf") pod "a7635360-cbb1-4161-9e1f-d65f7cef6c61" (UID: "a7635360-cbb1-4161-9e1f-d65f7cef6c61"). InnerVolumeSpecName "kube-api-access-9vvhf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:21:56.515629 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:56.515596 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vvhf\" (UniqueName: \"kubernetes.io/projected/a7635360-cbb1-4161-9e1f-d65f7cef6c61-kube-api-access-9vvhf\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:21:57.074856 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:57.074831 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bbf4d9849-clvm8_a7635360-cbb1-4161-9e1f-d65f7cef6c61/console/0.log" Apr 16 15:21:57.075268 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:57.074869 2573 generic.go:358] "Generic (PLEG): container finished" podID="a7635360-cbb1-4161-9e1f-d65f7cef6c61" containerID="5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e" exitCode=2 Apr 16 15:21:57.075268 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:57.074951 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bbf4d9849-clvm8" Apr 16 15:21:57.075268 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:57.074969 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbf4d9849-clvm8" event={"ID":"a7635360-cbb1-4161-9e1f-d65f7cef6c61","Type":"ContainerDied","Data":"5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e"} Apr 16 15:21:57.075268 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:57.075012 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbf4d9849-clvm8" event={"ID":"a7635360-cbb1-4161-9e1f-d65f7cef6c61","Type":"ContainerDied","Data":"e3d6d87c52e3f6d35183f06528dd0ea1a6d10625e6cdd20da9e7733037d2d28f"} Apr 16 15:21:57.075268 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:57.075028 2573 scope.go:117] "RemoveContainer" containerID="5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e" Apr 16 15:21:57.084365 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:57.084348 2573 scope.go:117] "RemoveContainer" containerID="5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e" Apr 16 15:21:57.084611 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:21:57.084594 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e\": container with ID starting with 5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e not found: ID does not exist" containerID="5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e" Apr 16 15:21:57.084687 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:57.084620 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e"} err="failed to get container status \"5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e\": rpc error: code = 
NotFound desc = could not find container \"5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e\": container with ID starting with 5bc1eefc72a224b20ebca13abff795a736f7e48760d37d56c79571a974b24d4e not found: ID does not exist" Apr 16 15:21:57.103633 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:57.103597 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bbf4d9849-clvm8"] Apr 16 15:21:57.112537 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:57.112507 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bbf4d9849-clvm8"] Apr 16 15:21:58.023911 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:21:58.023876 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7635360-cbb1-4161-9e1f-d65f7cef6c61" path="/var/lib/kubelet/pods/a7635360-cbb1-4161-9e1f-d65f7cef6c61/volumes" Apr 16 15:22:01.927366 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:01.927329 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log" Apr 16 15:22:01.927804 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:01.927555 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log" Apr 16 15:22:01.932513 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:01.932493 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:22:01.932805 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:01.932783 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:22:17.356383 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356305 
2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx"] Apr 16 15:22:17.356917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356829 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cd7de14-76d8-4975-a48c-583dc8ead026" containerName="util" Apr 16 15:22:17.356917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356849 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd7de14-76d8-4975-a48c-583dc8ead026" containerName="util" Apr 16 15:22:17.356917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356862 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cd7de14-76d8-4975-a48c-583dc8ead026" containerName="pull" Apr 16 15:22:17.356917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356871 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd7de14-76d8-4975-a48c-583dc8ead026" containerName="pull" Apr 16 15:22:17.356917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356884 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5935dea1-ae73-40c9-98ae-ed32f97ce643" containerName="util" Apr 16 15:22:17.356917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356893 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5935dea1-ae73-40c9-98ae-ed32f97ce643" containerName="util" Apr 16 15:22:17.356917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356905 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="844c3ca6-337b-4c63-b069-2bb7c48e6f4f" containerName="util" Apr 16 15:22:17.356917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356912 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="844c3ca6-337b-4c63-b069-2bb7c48e6f4f" containerName="util" Apr 16 15:22:17.356917 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356923 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="844c3ca6-337b-4c63-b069-2bb7c48e6f4f" containerName="pull" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356930 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="844c3ca6-337b-4c63-b069-2bb7c48e6f4f" containerName="pull" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356942 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cd7de14-76d8-4975-a48c-583dc8ead026" containerName="extract" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356951 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd7de14-76d8-4975-a48c-583dc8ead026" containerName="extract" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356963 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c7b8a7a-8603-481a-b226-086efa4a6d4f" containerName="util" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356970 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7b8a7a-8603-481a-b226-086efa4a6d4f" containerName="util" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356985 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5935dea1-ae73-40c9-98ae-ed32f97ce643" containerName="pull" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.356994 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5935dea1-ae73-40c9-98ae-ed32f97ce643" containerName="pull" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357011 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c7b8a7a-8603-481a-b226-086efa4a6d4f" containerName="extract" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357020 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7b8a7a-8603-481a-b226-086efa4a6d4f" containerName="extract" Apr 16 
15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357040 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5935dea1-ae73-40c9-98ae-ed32f97ce643" containerName="extract" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357048 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5935dea1-ae73-40c9-98ae-ed32f97ce643" containerName="extract" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357058 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="844c3ca6-337b-4c63-b069-2bb7c48e6f4f" containerName="extract" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357069 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="844c3ca6-337b-4c63-b069-2bb7c48e6f4f" containerName="extract" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357081 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c7b8a7a-8603-481a-b226-086efa4a6d4f" containerName="pull" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357089 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7b8a7a-8603-481a-b226-086efa4a6d4f" containerName="pull" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357104 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7635360-cbb1-4161-9e1f-d65f7cef6c61" containerName="console" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357112 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7635360-cbb1-4161-9e1f-d65f7cef6c61" containerName="console" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357205 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="844c3ca6-337b-4c63-b069-2bb7c48e6f4f" containerName="extract" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 
15:22:17.357221 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5935dea1-ae73-40c9-98ae-ed32f97ce643" containerName="extract" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357234 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7635360-cbb1-4161-9e1f-d65f7cef6c61" containerName="console" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357244 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c7b8a7a-8603-481a-b226-086efa4a6d4f" containerName="extract" Apr 16 15:22:17.357370 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.357256 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3cd7de14-76d8-4975-a48c-583dc8ead026" containerName="extract" Apr 16 15:22:17.360227 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.360207 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" Apr 16 15:22:17.362857 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.362826 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 15:22:17.362980 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.362940 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 15:22:17.363100 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.363080 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 15:22:17.364230 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.364211 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-mjgck\"" Apr 16 15:22:17.364330 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.364247 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 15:22:17.370183 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.370163 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx"] Apr 16 15:22:17.399456 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.399429 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2zsd\" (UniqueName: \"kubernetes.io/projected/83c616ca-a5f0-4123-9bfa-474d0ba7d534-kube-api-access-d2zsd\") pod \"kuadrant-console-plugin-6cb54b5c86-t42zx\" (UID: \"83c616ca-a5f0-4123-9bfa-474d0ba7d534\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" Apr 16 15:22:17.399572 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.399459 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/83c616ca-a5f0-4123-9bfa-474d0ba7d534-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-t42zx\" (UID: \"83c616ca-a5f0-4123-9bfa-474d0ba7d534\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" Apr 16 15:22:17.399572 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.399515 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/83c616ca-a5f0-4123-9bfa-474d0ba7d534-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-t42zx\" (UID: \"83c616ca-a5f0-4123-9bfa-474d0ba7d534\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" Apr 16 15:22:17.500474 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.500441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2zsd\" (UniqueName: \"kubernetes.io/projected/83c616ca-a5f0-4123-9bfa-474d0ba7d534-kube-api-access-d2zsd\") pod \"kuadrant-console-plugin-6cb54b5c86-t42zx\" (UID: 
\"83c616ca-a5f0-4123-9bfa-474d0ba7d534\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" Apr 16 15:22:17.500474 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.500475 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/83c616ca-a5f0-4123-9bfa-474d0ba7d534-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-t42zx\" (UID: \"83c616ca-a5f0-4123-9bfa-474d0ba7d534\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" Apr 16 15:22:17.500716 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.500506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/83c616ca-a5f0-4123-9bfa-474d0ba7d534-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-t42zx\" (UID: \"83c616ca-a5f0-4123-9bfa-474d0ba7d534\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" Apr 16 15:22:17.501116 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.501099 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/83c616ca-a5f0-4123-9bfa-474d0ba7d534-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-t42zx\" (UID: \"83c616ca-a5f0-4123-9bfa-474d0ba7d534\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" Apr 16 15:22:17.503041 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.503024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/83c616ca-a5f0-4123-9bfa-474d0ba7d534-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-t42zx\" (UID: \"83c616ca-a5f0-4123-9bfa-474d0ba7d534\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" Apr 16 15:22:17.508379 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.508349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d2zsd\" (UniqueName: \"kubernetes.io/projected/83c616ca-a5f0-4123-9bfa-474d0ba7d534-kube-api-access-d2zsd\") pod \"kuadrant-console-plugin-6cb54b5c86-t42zx\" (UID: \"83c616ca-a5f0-4123-9bfa-474d0ba7d534\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" Apr 16 15:22:17.670009 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.669977 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" Apr 16 15:22:17.791819 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:17.791789 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx"] Apr 16 15:22:17.793441 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:22:17.793410 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83c616ca_a5f0_4123_9bfa_474d0ba7d534.slice/crio-08c4b3d2dfc52ecd7d386d17927df2610932cd5b9d653dfe2afdfcf511c94473 WatchSource:0}: Error finding container 08c4b3d2dfc52ecd7d386d17927df2610932cd5b9d653dfe2afdfcf511c94473: Status 404 returned error can't find the container with id 08c4b3d2dfc52ecd7d386d17927df2610932cd5b9d653dfe2afdfcf511c94473 Apr 16 15:22:18.154716 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:18.154683 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" event={"ID":"83c616ca-a5f0-4123-9bfa-474d0ba7d534","Type":"ContainerStarted","Data":"08c4b3d2dfc52ecd7d386d17927df2610932cd5b9d653dfe2afdfcf511c94473"} Apr 16 15:22:43.263565 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:43.263532 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" event={"ID":"83c616ca-a5f0-4123-9bfa-474d0ba7d534","Type":"ContainerStarted","Data":"325e8c73f56ebcac908d3538a5cdd5b7c94cd115ea02397be8f1bbf9bddf079e"} Apr 16 15:22:43.279863 
ip-10-0-136-151 kubenswrapper[2573]: I0416 15:22:43.279799 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-t42zx" podStartSLOduration=1.304314217 podStartE2EDuration="26.279779615s" podCreationTimestamp="2026-04-16 15:22:17 +0000 UTC" firstStartedPulling="2026-04-16 15:22:17.794799922 +0000 UTC m=+616.381003711" lastFinishedPulling="2026-04-16 15:22:42.770265313 +0000 UTC m=+641.356469109" observedRunningTime="2026-04-16 15:22:43.279166832 +0000 UTC m=+641.865370643" watchObservedRunningTime="2026-04-16 15:22:43.279779615 +0000 UTC m=+641.865983427" Apr 16 15:23:01.558942 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.558901 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-r275c"] Apr 16 15:23:01.620811 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.620776 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-r275c"] Apr 16 15:23:01.620986 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.620910 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:01.624108 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.624083 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 15:23:01.663990 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.663960 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-r275c"] Apr 16 15:23:01.719611 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.719571 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/43674e8b-ed9d-4128-9768-3d1f604dfe6b-config-file\") pod \"limitador-limitador-7d549b5b-r275c\" (UID: \"43674e8b-ed9d-4128-9768-3d1f604dfe6b\") " pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:01.719814 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.719689 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzzv9\" (UniqueName: \"kubernetes.io/projected/43674e8b-ed9d-4128-9768-3d1f604dfe6b-kube-api-access-kzzv9\") pod \"limitador-limitador-7d549b5b-r275c\" (UID: \"43674e8b-ed9d-4128-9768-3d1f604dfe6b\") " pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:01.820708 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.820622 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzzv9\" (UniqueName: \"kubernetes.io/projected/43674e8b-ed9d-4128-9768-3d1f604dfe6b-kube-api-access-kzzv9\") pod \"limitador-limitador-7d549b5b-r275c\" (UID: \"43674e8b-ed9d-4128-9768-3d1f604dfe6b\") " pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:01.820855 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.820732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config-file\" (UniqueName: \"kubernetes.io/configmap/43674e8b-ed9d-4128-9768-3d1f604dfe6b-config-file\") pod \"limitador-limitador-7d549b5b-r275c\" (UID: \"43674e8b-ed9d-4128-9768-3d1f604dfe6b\") " pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:01.821355 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.821337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/43674e8b-ed9d-4128-9768-3d1f604dfe6b-config-file\") pod \"limitador-limitador-7d549b5b-r275c\" (UID: \"43674e8b-ed9d-4128-9768-3d1f604dfe6b\") " pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:01.829096 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.829077 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzzv9\" (UniqueName: \"kubernetes.io/projected/43674e8b-ed9d-4128-9768-3d1f604dfe6b-kube-api-access-kzzv9\") pod \"limitador-limitador-7d549b5b-r275c\" (UID: \"43674e8b-ed9d-4128-9768-3d1f604dfe6b\") " pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:01.931888 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:01.931839 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:02.060766 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:02.060743 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-r275c"] Apr 16 15:23:02.065708 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:02.065688 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:23:02.336284 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:02.336246 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" event={"ID":"43674e8b-ed9d-4128-9768-3d1f604dfe6b","Type":"ContainerStarted","Data":"41553dca4c76c04bb7bedeb8bb8387a51c78b18029a1f5d6c8e25d2f219832a7"} Apr 16 15:23:02.615309 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:02.615235 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-hbflh"] Apr 16 15:23:02.618694 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:02.618674 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hbflh" Apr 16 15:23:02.621582 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:02.621558 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-fssxm\"" Apr 16 15:23:02.627599 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:02.627578 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-hbflh"] Apr 16 15:23:02.728662 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:02.728595 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4zzr\" (UniqueName: \"kubernetes.io/projected/aa3343d1-02a6-451c-90cc-eef878ebfa29-kube-api-access-j4zzr\") pod \"authorino-7498df8756-hbflh\" (UID: \"aa3343d1-02a6-451c-90cc-eef878ebfa29\") " pod="kuadrant-system/authorino-7498df8756-hbflh" Apr 16 15:23:02.830188 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:02.829689 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4zzr\" (UniqueName: \"kubernetes.io/projected/aa3343d1-02a6-451c-90cc-eef878ebfa29-kube-api-access-j4zzr\") pod \"authorino-7498df8756-hbflh\" (UID: \"aa3343d1-02a6-451c-90cc-eef878ebfa29\") " pod="kuadrant-system/authorino-7498df8756-hbflh" Apr 16 15:23:02.839038 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:02.838982 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4zzr\" (UniqueName: \"kubernetes.io/projected/aa3343d1-02a6-451c-90cc-eef878ebfa29-kube-api-access-j4zzr\") pod \"authorino-7498df8756-hbflh\" (UID: \"aa3343d1-02a6-451c-90cc-eef878ebfa29\") " pod="kuadrant-system/authorino-7498df8756-hbflh" Apr 16 15:23:02.929956 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:02.929918 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hbflh" Apr 16 15:23:03.137342 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:03.137311 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-hbflh"] Apr 16 15:23:03.138446 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:23:03.138420 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa3343d1_02a6_451c_90cc_eef878ebfa29.slice/crio-d2b8cbc68f79d1ca2451eb97d952988d8335d93bda4d10a826e70fa8fcfcf0f0 WatchSource:0}: Error finding container d2b8cbc68f79d1ca2451eb97d952988d8335d93bda4d10a826e70fa8fcfcf0f0: Status 404 returned error can't find the container with id d2b8cbc68f79d1ca2451eb97d952988d8335d93bda4d10a826e70fa8fcfcf0f0 Apr 16 15:23:03.341766 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:03.341678 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hbflh" event={"ID":"aa3343d1-02a6-451c-90cc-eef878ebfa29","Type":"ContainerStarted","Data":"d2b8cbc68f79d1ca2451eb97d952988d8335d93bda4d10a826e70fa8fcfcf0f0"} Apr 16 15:23:05.353025 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:05.352922 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" event={"ID":"43674e8b-ed9d-4128-9768-3d1f604dfe6b","Type":"ContainerStarted","Data":"bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6"} Apr 16 15:23:05.353025 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:05.353018 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:05.373059 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:05.373012 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" podStartSLOduration=1.446004837 
podStartE2EDuration="4.372998056s" podCreationTimestamp="2026-04-16 15:23:01 +0000 UTC" firstStartedPulling="2026-04-16 15:23:02.065810053 +0000 UTC m=+660.652013843" lastFinishedPulling="2026-04-16 15:23:04.992803264 +0000 UTC m=+663.579007062" observedRunningTime="2026-04-16 15:23:05.369427304 +0000 UTC m=+663.955631114" watchObservedRunningTime="2026-04-16 15:23:05.372998056 +0000 UTC m=+663.959201865" Apr 16 15:23:06.359038 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:06.359005 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hbflh" event={"ID":"aa3343d1-02a6-451c-90cc-eef878ebfa29","Type":"ContainerStarted","Data":"5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88"} Apr 16 15:23:16.360089 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:16.360060 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:16.378107 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:16.378057 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-hbflh" podStartSLOduration=11.246744157 podStartE2EDuration="14.378042496s" podCreationTimestamp="2026-04-16 15:23:02 +0000 UTC" firstStartedPulling="2026-04-16 15:23:03.140621007 +0000 UTC m=+661.726824798" lastFinishedPulling="2026-04-16 15:23:06.271919348 +0000 UTC m=+664.858123137" observedRunningTime="2026-04-16 15:23:06.373968506 +0000 UTC m=+664.960172316" watchObservedRunningTime="2026-04-16 15:23:16.378042496 +0000 UTC m=+674.964246302" Apr 16 15:23:16.526327 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:16.526285 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-r275c"] Apr 16 15:23:16.526506 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:16.526481 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" podUID="43674e8b-ed9d-4128-9768-3d1f604dfe6b" containerName="limitador" containerID="cri-o://bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6" gracePeriod=30 Apr 16 15:23:17.066022 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.066002 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:17.256523 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.256488 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzzv9\" (UniqueName: \"kubernetes.io/projected/43674e8b-ed9d-4128-9768-3d1f604dfe6b-kube-api-access-kzzv9\") pod \"43674e8b-ed9d-4128-9768-3d1f604dfe6b\" (UID: \"43674e8b-ed9d-4128-9768-3d1f604dfe6b\") " Apr 16 15:23:17.256523 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.256526 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/43674e8b-ed9d-4128-9768-3d1f604dfe6b-config-file\") pod \"43674e8b-ed9d-4128-9768-3d1f604dfe6b\" (UID: \"43674e8b-ed9d-4128-9768-3d1f604dfe6b\") " Apr 16 15:23:17.256913 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.256891 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43674e8b-ed9d-4128-9768-3d1f604dfe6b-config-file" (OuterVolumeSpecName: "config-file") pod "43674e8b-ed9d-4128-9768-3d1f604dfe6b" (UID: "43674e8b-ed9d-4128-9768-3d1f604dfe6b"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:23:17.258703 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.258682 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43674e8b-ed9d-4128-9768-3d1f604dfe6b-kube-api-access-kzzv9" (OuterVolumeSpecName: "kube-api-access-kzzv9") pod "43674e8b-ed9d-4128-9768-3d1f604dfe6b" (UID: "43674e8b-ed9d-4128-9768-3d1f604dfe6b"). InnerVolumeSpecName "kube-api-access-kzzv9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:23:17.357115 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.357076 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kzzv9\" (UniqueName: \"kubernetes.io/projected/43674e8b-ed9d-4128-9768-3d1f604dfe6b-kube-api-access-kzzv9\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:23:17.357115 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.357108 2573 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/43674e8b-ed9d-4128-9768-3d1f604dfe6b-config-file\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:23:17.402288 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.402256 2573 generic.go:358] "Generic (PLEG): container finished" podID="43674e8b-ed9d-4128-9768-3d1f604dfe6b" containerID="bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6" exitCode=0 Apr 16 15:23:17.402714 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.402315 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" Apr 16 15:23:17.402714 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.402344 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" event={"ID":"43674e8b-ed9d-4128-9768-3d1f604dfe6b","Type":"ContainerDied","Data":"bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6"} Apr 16 15:23:17.402714 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.402378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-r275c" event={"ID":"43674e8b-ed9d-4128-9768-3d1f604dfe6b","Type":"ContainerDied","Data":"41553dca4c76c04bb7bedeb8bb8387a51c78b18029a1f5d6c8e25d2f219832a7"} Apr 16 15:23:17.402714 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.402394 2573 scope.go:117] "RemoveContainer" containerID="bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6" Apr 16 15:23:17.411254 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.411240 2573 scope.go:117] "RemoveContainer" containerID="bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6" Apr 16 15:23:17.411480 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:23:17.411465 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6\": container with ID starting with bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6 not found: ID does not exist" containerID="bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6" Apr 16 15:23:17.411522 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.411488 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6"} err="failed to get container status 
\"bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6\": rpc error: code = NotFound desc = could not find container \"bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6\": container with ID starting with bb6cbfb49259983c29061f28dc4c8ea6be47086b987317c625390d4bda7dd4e6 not found: ID does not exist" Apr 16 15:23:17.424716 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.424691 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-r275c"] Apr 16 15:23:17.428505 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.428487 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-r275c"] Apr 16 15:23:17.508141 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.508061 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-29667"] Apr 16 15:23:17.508502 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.508484 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43674e8b-ed9d-4128-9768-3d1f604dfe6b" containerName="limitador" Apr 16 15:23:17.508580 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.508505 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="43674e8b-ed9d-4128-9768-3d1f604dfe6b" containerName="limitador" Apr 16 15:23:17.508637 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.508620 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="43674e8b-ed9d-4128-9768-3d1f604dfe6b" containerName="limitador" Apr 16 15:23:17.513043 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.513024 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-29667" Apr 16 15:23:17.516486 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.516463 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-8xvhn\"" Apr 16 15:23:17.516840 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.516822 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 16 15:23:17.524747 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.524722 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-29667"] Apr 16 15:23:17.660142 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.660114 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvxgc\" (UniqueName: \"kubernetes.io/projected/14b38495-0655-4ac1-a07f-f1bb931d9b6d-kube-api-access-bvxgc\") pod \"postgres-868db5846d-29667\" (UID: \"14b38495-0655-4ac1-a07f-f1bb931d9b6d\") " pod="opendatahub/postgres-868db5846d-29667" Apr 16 15:23:17.660317 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.660162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/14b38495-0655-4ac1-a07f-f1bb931d9b6d-data\") pod \"postgres-868db5846d-29667\" (UID: \"14b38495-0655-4ac1-a07f-f1bb931d9b6d\") " pod="opendatahub/postgres-868db5846d-29667" Apr 16 15:23:17.761227 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.761130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/14b38495-0655-4ac1-a07f-f1bb931d9b6d-data\") pod \"postgres-868db5846d-29667\" (UID: \"14b38495-0655-4ac1-a07f-f1bb931d9b6d\") " pod="opendatahub/postgres-868db5846d-29667" Apr 16 15:23:17.761227 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.761223 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bvxgc\" (UniqueName: \"kubernetes.io/projected/14b38495-0655-4ac1-a07f-f1bb931d9b6d-kube-api-access-bvxgc\") pod \"postgres-868db5846d-29667\" (UID: \"14b38495-0655-4ac1-a07f-f1bb931d9b6d\") " pod="opendatahub/postgres-868db5846d-29667" Apr 16 15:23:17.761560 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.761541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/14b38495-0655-4ac1-a07f-f1bb931d9b6d-data\") pod \"postgres-868db5846d-29667\" (UID: \"14b38495-0655-4ac1-a07f-f1bb931d9b6d\") " pod="opendatahub/postgres-868db5846d-29667" Apr 16 15:23:17.770811 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.770778 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvxgc\" (UniqueName: \"kubernetes.io/projected/14b38495-0655-4ac1-a07f-f1bb931d9b6d-kube-api-access-bvxgc\") pod \"postgres-868db5846d-29667\" (UID: \"14b38495-0655-4ac1-a07f-f1bb931d9b6d\") " pod="opendatahub/postgres-868db5846d-29667" Apr 16 15:23:17.826350 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:17.826324 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-29667" Apr 16 15:23:18.018255 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:18.018177 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43674e8b-ed9d-4128-9768-3d1f604dfe6b" path="/var/lib/kubelet/pods/43674e8b-ed9d-4128-9768-3d1f604dfe6b/volumes" Apr 16 15:23:18.155162 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:18.155136 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-29667"] Apr 16 15:23:18.156230 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:23:18.156198 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14b38495_0655_4ac1_a07f_f1bb931d9b6d.slice/crio-dc9d2fd3122a92ec8f7b66a254024d0f3b7c72461ad879a922bb93dc936d11b3 WatchSource:0}: Error finding container dc9d2fd3122a92ec8f7b66a254024d0f3b7c72461ad879a922bb93dc936d11b3: Status 404 returned error can't find the container with id dc9d2fd3122a92ec8f7b66a254024d0f3b7c72461ad879a922bb93dc936d11b3 Apr 16 15:23:18.407326 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:18.407290 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-29667" event={"ID":"14b38495-0655-4ac1-a07f-f1bb931d9b6d","Type":"ContainerStarted","Data":"dc9d2fd3122a92ec8f7b66a254024d0f3b7c72461ad879a922bb93dc936d11b3"} Apr 16 15:23:24.434780 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:24.434747 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-29667" event={"ID":"14b38495-0655-4ac1-a07f-f1bb931d9b6d","Type":"ContainerStarted","Data":"fe87ce36f5d5fa9f62402bad2a2e808e0591111ac5838b84bf88bdd810418b3c"} Apr 16 15:23:24.435149 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:24.434804 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-29667" Apr 16 15:23:24.453987 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:23:24.453939 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-29667" podStartSLOduration=1.311925183 podStartE2EDuration="7.453925683s" podCreationTimestamp="2026-04-16 15:23:17 +0000 UTC" firstStartedPulling="2026-04-16 15:23:18.157593906 +0000 UTC m=+676.743797701" lastFinishedPulling="2026-04-16 15:23:24.299594414 +0000 UTC m=+682.885798201" observedRunningTime="2026-04-16 15:23:24.451068059 +0000 UTC m=+683.037271884" watchObservedRunningTime="2026-04-16 15:23:24.453925683 +0000 UTC m=+683.040129493" Apr 16 15:23:30.468446 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:30.468418 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-29667" Apr 16 15:23:32.107820 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.107786 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-l4ptg"] Apr 16 15:23:32.113667 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.113624 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-l4ptg" Apr 16 15:23:32.121107 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.121087 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-l4ptg"] Apr 16 15:23:32.180000 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.179959 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltt57\" (UniqueName: \"kubernetes.io/projected/2a310d03-cb89-49c0-bd89-5ce9ba6f32d5-kube-api-access-ltt57\") pod \"authorino-8b475cf9f-l4ptg\" (UID: \"2a310d03-cb89-49c0-bd89-5ce9ba6f32d5\") " pod="kuadrant-system/authorino-8b475cf9f-l4ptg" Apr 16 15:23:32.280788 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.280753 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltt57\" (UniqueName: \"kubernetes.io/projected/2a310d03-cb89-49c0-bd89-5ce9ba6f32d5-kube-api-access-ltt57\") pod \"authorino-8b475cf9f-l4ptg\" (UID: \"2a310d03-cb89-49c0-bd89-5ce9ba6f32d5\") " pod="kuadrant-system/authorino-8b475cf9f-l4ptg" Apr 16 15:23:32.292079 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.292045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltt57\" (UniqueName: \"kubernetes.io/projected/2a310d03-cb89-49c0-bd89-5ce9ba6f32d5-kube-api-access-ltt57\") pod \"authorino-8b475cf9f-l4ptg\" (UID: \"2a310d03-cb89-49c0-bd89-5ce9ba6f32d5\") " pod="kuadrant-system/authorino-8b475cf9f-l4ptg" Apr 16 15:23:32.352758 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.352713 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-l4ptg"] Apr 16 15:23:32.353030 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.353014 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-l4ptg" Apr 16 15:23:32.377466 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.377427 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-ccf95b8f9-4tjh4"] Apr 16 15:23:32.383460 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.383437 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-ccf95b8f9-4tjh4" Apr 16 15:23:32.388010 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.387978 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-ccf95b8f9-4tjh4"] Apr 16 15:23:32.452903 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.452866 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-ccf95b8f9-4tjh4"] Apr 16 15:23:32.453173 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:23:32.453152 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cjjz9], unattached volumes=[], failed to process volumes=[kube-api-access-cjjz9]: context canceled" pod="kuadrant-system/authorino-ccf95b8f9-4tjh4" podUID="3feea7f8-d707-42d2-a550-20344e86382c" Apr 16 15:23:32.467584 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.467559 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-ccf95b8f9-4tjh4" Apr 16 15:23:32.472752 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.472731 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-ccf95b8f9-4tjh4" Apr 16 15:23:32.483307 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:32.483286 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-l4ptg"] Apr 16 15:23:32.484796 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:23:32.484774 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a310d03_cb89_49c0_bd89_5ce9ba6f32d5.slice/crio-ae67dba4b168805ac6827ac0ba4d49a2af8bf0e976de8da6c6bc7a9c8c3fbd8b WatchSource:0}: Error finding container ae67dba4b168805ac6827ac0ba4d49a2af8bf0e976de8da6c6bc7a9c8c3fbd8b: Status 404 returned error can't find the container with id ae67dba4b168805ac6827ac0ba4d49a2af8bf0e976de8da6c6bc7a9c8c3fbd8b Apr 16 15:23:33.472177 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:33.472151 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-ccf95b8f9-4tjh4" Apr 16 15:23:33.472740 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:33.472185 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-l4ptg" event={"ID":"2a310d03-cb89-49c0-bd89-5ce9ba6f32d5","Type":"ContainerStarted","Data":"022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e"} Apr 16 15:23:33.472740 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:33.472208 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-l4ptg" podUID="2a310d03-cb89-49c0-bd89-5ce9ba6f32d5" containerName="authorino" containerID="cri-o://022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e" gracePeriod=30 Apr 16 15:23:33.472740 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:33.472218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-l4ptg" 
event={"ID":"2a310d03-cb89-49c0-bd89-5ce9ba6f32d5","Type":"ContainerStarted","Data":"ae67dba4b168805ac6827ac0ba4d49a2af8bf0e976de8da6c6bc7a9c8c3fbd8b"} Apr 16 15:23:33.490787 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:33.490738 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-l4ptg" podStartSLOduration=1.033498785 podStartE2EDuration="1.490726514s" podCreationTimestamp="2026-04-16 15:23:32 +0000 UTC" firstStartedPulling="2026-04-16 15:23:32.486111419 +0000 UTC m=+691.072315223" lastFinishedPulling="2026-04-16 15:23:32.943339161 +0000 UTC m=+691.529542952" observedRunningTime="2026-04-16 15:23:33.488483127 +0000 UTC m=+692.074686936" watchObservedRunningTime="2026-04-16 15:23:33.490726514 +0000 UTC m=+692.076930322" Apr 16 15:23:33.523374 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:33.523344 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-ccf95b8f9-4tjh4"] Apr 16 15:23:33.529865 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:33.529841 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-ccf95b8f9-4tjh4"] Apr 16 15:23:33.752512 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:33.752492 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-l4ptg" Apr 16 15:23:33.795012 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:33.794983 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltt57\" (UniqueName: \"kubernetes.io/projected/2a310d03-cb89-49c0-bd89-5ce9ba6f32d5-kube-api-access-ltt57\") pod \"2a310d03-cb89-49c0-bd89-5ce9ba6f32d5\" (UID: \"2a310d03-cb89-49c0-bd89-5ce9ba6f32d5\") " Apr 16 15:23:33.797081 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:33.797053 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a310d03-cb89-49c0-bd89-5ce9ba6f32d5-kube-api-access-ltt57" (OuterVolumeSpecName: "kube-api-access-ltt57") pod "2a310d03-cb89-49c0-bd89-5ce9ba6f32d5" (UID: "2a310d03-cb89-49c0-bd89-5ce9ba6f32d5"). InnerVolumeSpecName "kube-api-access-ltt57". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:23:33.896373 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:33.896349 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ltt57\" (UniqueName: \"kubernetes.io/projected/2a310d03-cb89-49c0-bd89-5ce9ba6f32d5-kube-api-access-ltt57\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:23:34.018226 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.018150 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3feea7f8-d707-42d2-a550-20344e86382c" path="/var/lib/kubelet/pods/3feea7f8-d707-42d2-a550-20344e86382c/volumes" Apr 16 15:23:34.474619 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.474586 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-hbflh"] Apr 16 15:23:34.475089 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.474852 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-hbflh" podUID="aa3343d1-02a6-451c-90cc-eef878ebfa29" 
containerName="authorino" containerID="cri-o://5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88" gracePeriod=30 Apr 16 15:23:34.477522 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.477492 2573 generic.go:358] "Generic (PLEG): container finished" podID="2a310d03-cb89-49c0-bd89-5ce9ba6f32d5" containerID="022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e" exitCode=0 Apr 16 15:23:34.477693 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.477540 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-l4ptg" Apr 16 15:23:34.477925 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.477540 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-l4ptg" event={"ID":"2a310d03-cb89-49c0-bd89-5ce9ba6f32d5","Type":"ContainerDied","Data":"022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e"} Apr 16 15:23:34.478017 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.477923 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-l4ptg" event={"ID":"2a310d03-cb89-49c0-bd89-5ce9ba6f32d5","Type":"ContainerDied","Data":"ae67dba4b168805ac6827ac0ba4d49a2af8bf0e976de8da6c6bc7a9c8c3fbd8b"} Apr 16 15:23:34.478017 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.477950 2573 scope.go:117] "RemoveContainer" containerID="022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e" Apr 16 15:23:34.490017 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.489999 2573 scope.go:117] "RemoveContainer" containerID="022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e" Apr 16 15:23:34.490282 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:23:34.490261 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e\": container with ID starting with 
022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e not found: ID does not exist" containerID="022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e" Apr 16 15:23:34.490351 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.490294 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e"} err="failed to get container status \"022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e\": rpc error: code = NotFound desc = could not find container \"022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e\": container with ID starting with 022dad4b53aa81379c6c1cbaa2ed955669ca914fc19b27176102f0d53887322e not found: ID does not exist" Apr 16 15:23:34.499637 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.499613 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-l4ptg"] Apr 16 15:23:34.505309 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.505288 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-l4ptg"] Apr 16 15:23:34.733377 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.733312 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hbflh" Apr 16 15:23:34.803971 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.803942 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4zzr\" (UniqueName: \"kubernetes.io/projected/aa3343d1-02a6-451c-90cc-eef878ebfa29-kube-api-access-j4zzr\") pod \"aa3343d1-02a6-451c-90cc-eef878ebfa29\" (UID: \"aa3343d1-02a6-451c-90cc-eef878ebfa29\") " Apr 16 15:23:34.806154 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.806117 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa3343d1-02a6-451c-90cc-eef878ebfa29-kube-api-access-j4zzr" (OuterVolumeSpecName: "kube-api-access-j4zzr") pod "aa3343d1-02a6-451c-90cc-eef878ebfa29" (UID: "aa3343d1-02a6-451c-90cc-eef878ebfa29"). InnerVolumeSpecName "kube-api-access-j4zzr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:23:34.905525 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:34.905490 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4zzr\" (UniqueName: \"kubernetes.io/projected/aa3343d1-02a6-451c-90cc-eef878ebfa29-kube-api-access-j4zzr\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:23:35.482504 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:35.482471 2573 generic.go:358] "Generic (PLEG): container finished" podID="aa3343d1-02a6-451c-90cc-eef878ebfa29" containerID="5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88" exitCode=0 Apr 16 15:23:35.482939 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:35.482523 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hbflh" Apr 16 15:23:35.482939 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:35.482543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hbflh" event={"ID":"aa3343d1-02a6-451c-90cc-eef878ebfa29","Type":"ContainerDied","Data":"5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88"} Apr 16 15:23:35.482939 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:35.482576 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hbflh" event={"ID":"aa3343d1-02a6-451c-90cc-eef878ebfa29","Type":"ContainerDied","Data":"d2b8cbc68f79d1ca2451eb97d952988d8335d93bda4d10a826e70fa8fcfcf0f0"} Apr 16 15:23:35.482939 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:35.482594 2573 scope.go:117] "RemoveContainer" containerID="5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88" Apr 16 15:23:35.491571 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:35.491552 2573 scope.go:117] "RemoveContainer" containerID="5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88" Apr 16 15:23:35.491811 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:23:35.491792 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88\": container with ID starting with 5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88 not found: ID does not exist" containerID="5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88" Apr 16 15:23:35.491863 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:35.491820 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88"} err="failed to get container status \"5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88\": rpc error: code = 
NotFound desc = could not find container \"5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88\": container with ID starting with 5cfc63ee910f9759e6f8d168ca6ea866a0727d75a93c0f67840a97c5babe4d88 not found: ID does not exist" Apr 16 15:23:35.505528 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:35.505503 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-hbflh"] Apr 16 15:23:35.509256 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:35.509233 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-hbflh"] Apr 16 15:23:36.018772 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:36.018741 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a310d03-cb89-49c0-bd89-5ce9ba6f32d5" path="/var/lib/kubelet/pods/2a310d03-cb89-49c0-bd89-5ce9ba6f32d5/volumes" Apr 16 15:23:36.019096 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:36.019083 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa3343d1-02a6-451c-90cc-eef878ebfa29" path="/var/lib/kubelet/pods/aa3343d1-02a6-451c-90cc-eef878ebfa29/volumes" Apr 16 15:23:47.772195 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.772157 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-68fcd65c4f-4d5nw"] Apr 16 15:23:47.772575 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.772532 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa3343d1-02a6-451c-90cc-eef878ebfa29" containerName="authorino" Apr 16 15:23:47.772575 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.772543 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa3343d1-02a6-451c-90cc-eef878ebfa29" containerName="authorino" Apr 16 15:23:47.772575 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.772557 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a310d03-cb89-49c0-bd89-5ce9ba6f32d5" 
containerName="authorino" Apr 16 15:23:47.772575 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.772562 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a310d03-cb89-49c0-bd89-5ce9ba6f32d5" containerName="authorino" Apr 16 15:23:47.772742 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.772621 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa3343d1-02a6-451c-90cc-eef878ebfa29" containerName="authorino" Apr 16 15:23:47.772742 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.772633 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a310d03-cb89-49c0-bd89-5ce9ba6f32d5" containerName="authorino" Apr 16 15:23:47.775826 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.775811 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" Apr 16 15:23:47.778770 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.778747 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-zzh56\"" Apr 16 15:23:47.791213 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.787426 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-68fcd65c4f-4d5nw"] Apr 16 15:23:47.815888 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.815861 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94bsl\" (UniqueName: \"kubernetes.io/projected/75c4d4e1-bb0c-4f6c-b71c-936923b4f26b-kube-api-access-94bsl\") pod \"maas-controller-68fcd65c4f-4d5nw\" (UID: \"75c4d4e1-bb0c-4f6c-b71c-936923b4f26b\") " pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" Apr 16 15:23:47.917238 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.917182 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94bsl\" (UniqueName: 
\"kubernetes.io/projected/75c4d4e1-bb0c-4f6c-b71c-936923b4f26b-kube-api-access-94bsl\") pod \"maas-controller-68fcd65c4f-4d5nw\" (UID: \"75c4d4e1-bb0c-4f6c-b71c-936923b4f26b\") " pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" Apr 16 15:23:47.927145 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:47.927117 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94bsl\" (UniqueName: \"kubernetes.io/projected/75c4d4e1-bb0c-4f6c-b71c-936923b4f26b-kube-api-access-94bsl\") pod \"maas-controller-68fcd65c4f-4d5nw\" (UID: \"75c4d4e1-bb0c-4f6c-b71c-936923b4f26b\") " pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" Apr 16 15:23:48.088912 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:48.088831 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" Apr 16 15:23:48.218010 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:48.217976 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-68fcd65c4f-4d5nw"] Apr 16 15:23:48.219127 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:23:48.219097 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75c4d4e1_bb0c_4f6c_b71c_936923b4f26b.slice/crio-7c9c9770ea6dba21d9c6c22c2d9e9122030d9e68c3b932b70572af8ca0248758 WatchSource:0}: Error finding container 7c9c9770ea6dba21d9c6c22c2d9e9122030d9e68c3b932b70572af8ca0248758: Status 404 returned error can't find the container with id 7c9c9770ea6dba21d9c6c22c2d9e9122030d9e68c3b932b70572af8ca0248758 Apr 16 15:23:48.535667 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:48.535607 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" event={"ID":"75c4d4e1-bb0c-4f6c-b71c-936923b4f26b","Type":"ContainerStarted","Data":"7c9c9770ea6dba21d9c6c22c2d9e9122030d9e68c3b932b70572af8ca0248758"} Apr 16 15:23:50.545500 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:23:50.545408 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" event={"ID":"75c4d4e1-bb0c-4f6c-b71c-936923b4f26b","Type":"ContainerStarted","Data":"a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0"} Apr 16 15:23:50.545500 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:50.545454 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" Apr 16 15:23:50.564318 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:23:50.564271 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" podStartSLOduration=1.5443632699999998 podStartE2EDuration="3.564259099s" podCreationTimestamp="2026-04-16 15:23:47 +0000 UTC" firstStartedPulling="2026-04-16 15:23:48.220382263 +0000 UTC m=+706.806586054" lastFinishedPulling="2026-04-16 15:23:50.240278086 +0000 UTC m=+708.826481883" observedRunningTime="2026-04-16 15:23:50.562495144 +0000 UTC m=+709.148698954" watchObservedRunningTime="2026-04-16 15:23:50.564259099 +0000 UTC m=+709.150462974" Apr 16 15:24:01.556238 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:01.556210 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" Apr 16 15:24:08.574100 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.574056 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"] Apr 16 15:24:08.579274 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.579254 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h" Apr 16 15:24:08.582508 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.582481 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-wnsmp\"" Apr 16 15:24:08.582635 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.582584 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 15:24:08.583726 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.583708 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 16 15:24:08.583887 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.583864 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 15:24:08.591051 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.591028 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"] Apr 16 15:24:08.704965 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.704925 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h" Apr 16 15:24:08.705124 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.704978 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h" Apr 16 
15:24:08.705124 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.705040 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.705124 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.705065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-766xr\" (UniqueName: \"kubernetes.io/projected/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-kube-api-access-766xr\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.705124 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.705086 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.705293 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.705144 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.806481 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.806441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.806713 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.806503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.806713 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.806554 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.806713 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.806589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-766xr\" (UniqueName: \"kubernetes.io/projected/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-kube-api-access-766xr\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.806713 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.806624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.806713 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.806683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.807009 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.806911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.807009 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.806928 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.807009 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.806979 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.809056 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.809029 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.809305 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.809288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.816444 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.816420 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-766xr\" (UniqueName: \"kubernetes.io/projected/ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b-kube-api-access-766xr\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h\" (UID: \"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:08.890906 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:08.890847 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:09.018884 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:09.018854 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"]
Apr 16 15:24:09.020077 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:24:09.020049 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae36f93c_ad1a_4b1c_8155_9b0371fd4f5b.slice/crio-082afc3bc456299b0f2288691d34bb268d3d380adccf4644f02647be82efe92f WatchSource:0}: Error finding container 082afc3bc456299b0f2288691d34bb268d3d380adccf4644f02647be82efe92f: Status 404 returned error can't find the container with id 082afc3bc456299b0f2288691d34bb268d3d380adccf4644f02647be82efe92f
Apr 16 15:24:09.620610 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:09.620570 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h" event={"ID":"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b","Type":"ContainerStarted","Data":"082afc3bc456299b0f2288691d34bb268d3d380adccf4644f02647be82efe92f"}
Apr 16 15:24:14.644397 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:14.644359 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h" event={"ID":"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b","Type":"ContainerStarted","Data":"a1f76b5ec06abfa82558c10e8c5b45e522e277a66386068865ad41b4b314b806"}
Apr 16 15:24:19.665291 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:19.665258 2573 generic.go:358] "Generic (PLEG): container finished" podID="ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b" containerID="a1f76b5ec06abfa82558c10e8c5b45e522e277a66386068865ad41b4b314b806" exitCode=0
Apr 16 15:24:19.665672 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:19.665335 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h" event={"ID":"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b","Type":"ContainerDied","Data":"a1f76b5ec06abfa82558c10e8c5b45e522e277a66386068865ad41b4b314b806"}
Apr 16 15:24:23.687410 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:23.687377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h" event={"ID":"ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b","Type":"ContainerStarted","Data":"65157860869f3e25b80cf2b507f8f9a09076d34a3476b169a304f1aef5bad0ef"}
Apr 16 15:24:23.687800 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:23.687586 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:23.709411 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:23.709362 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h" podStartSLOduration=1.463772397 podStartE2EDuration="15.7093487s" podCreationTimestamp="2026-04-16 15:24:08 +0000 UTC" firstStartedPulling="2026-04-16 15:24:09.021993492 +0000 UTC m=+727.608197282" lastFinishedPulling="2026-04-16 15:24:23.267569795 +0000 UTC m=+741.853773585" observedRunningTime="2026-04-16 15:24:23.707319121 +0000 UTC m=+742.293522929" watchObservedRunningTime="2026-04-16 15:24:23.7093487 +0000 UTC m=+742.295552508"
Apr 16 15:24:29.873831 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:29.873788 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"]
Apr 16 15:24:29.879669 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:29.879632 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:29.882434 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:29.882411 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\""
Apr 16 15:24:29.888063 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:29.888039 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"]
Apr 16 15:24:30.012321 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.012292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.012321 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.012326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.012514 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.012360 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbnvn\" (UniqueName: \"kubernetes.io/projected/2d34e857-d835-4460-9449-4bc59c723ae8-kube-api-access-rbnvn\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.012514 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.012426 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.012514 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.012474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.012514 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.012489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d34e857-d835-4460-9449-4bc59c723ae8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.113083 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.113048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.113291 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.113206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.113291 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.113278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbnvn\" (UniqueName: \"kubernetes.io/projected/2d34e857-d835-4460-9449-4bc59c723ae8-kube-api-access-rbnvn\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.113437 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.113354 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.113506 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.113461 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.113506 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.113489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d34e857-d835-4460-9449-4bc59c723ae8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.113614 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.113523 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.113712 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.113612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.113860 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.113840 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.115542 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.115524 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2d34e857-d835-4460-9449-4bc59c723ae8-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.116184 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.116160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d34e857-d835-4460-9449-4bc59c723ae8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.122399 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.122378 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbnvn\" (UniqueName: \"kubernetes.io/projected/2d34e857-d835-4460-9449-4bc59c723ae8-kube-api-access-rbnvn\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx\" (UID: \"2d34e857-d835-4460-9449-4bc59c723ae8\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.190595 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.190565 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:30.321440 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.321402 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"]
Apr 16 15:24:30.322704 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:24:30.322681 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d34e857_d835_4460_9449_4bc59c723ae8.slice/crio-02528fda2d70c9347847f92822209e9a5ea845c5b6df8a2db1d62cd3522d1afd WatchSource:0}: Error finding container 02528fda2d70c9347847f92822209e9a5ea845c5b6df8a2db1d62cd3522d1afd: Status 404 returned error can't find the container with id 02528fda2d70c9347847f92822209e9a5ea845c5b6df8a2db1d62cd3522d1afd
Apr 16 15:24:30.714849 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.714806 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx" event={"ID":"2d34e857-d835-4460-9449-4bc59c723ae8","Type":"ContainerStarted","Data":"cf1657e552244baaa01c3f9e971e90a43a9933679a9855919e844b353dfa0588"}
Apr 16 15:24:30.714849 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:30.714855 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx" event={"ID":"2d34e857-d835-4460-9449-4bc59c723ae8","Type":"ContainerStarted","Data":"02528fda2d70c9347847f92822209e9a5ea845c5b6df8a2db1d62cd3522d1afd"}
Apr 16 15:24:34.705038 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:34.705008 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h"
Apr 16 15:24:38.749252 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:38.749202 2573 generic.go:358] "Generic (PLEG): container finished" podID="2d34e857-d835-4460-9449-4bc59c723ae8" containerID="cf1657e552244baaa01c3f9e971e90a43a9933679a9855919e844b353dfa0588" exitCode=0
Apr 16 15:24:38.749628 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:38.749279 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx" event={"ID":"2d34e857-d835-4460-9449-4bc59c723ae8","Type":"ContainerDied","Data":"cf1657e552244baaa01c3f9e971e90a43a9933679a9855919e844b353dfa0588"}
Apr 16 15:24:39.755021 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:39.754980 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx" event={"ID":"2d34e857-d835-4460-9449-4bc59c723ae8","Type":"ContainerStarted","Data":"94337987ea98f184b0476cf5200aeb87e3c103fac03fe350b3516136ca7620aa"}
Apr 16 15:24:39.755514 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:39.755192 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:24:39.789310 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:39.789257 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx" podStartSLOduration=10.51218724 podStartE2EDuration="10.789241363s" podCreationTimestamp="2026-04-16 15:24:29 +0000 UTC" firstStartedPulling="2026-04-16 15:24:38.749938338 +0000 UTC m=+757.336142125" lastFinishedPulling="2026-04-16 15:24:39.026992462 +0000 UTC m=+757.613196248" observedRunningTime="2026-04-16 15:24:39.786250703 +0000 UTC m=+758.372454524" watchObservedRunningTime="2026-04-16 15:24:39.789241363 +0000 UTC m=+758.375445171"
Apr 16 15:24:50.773072 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:24:50.773041 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx"
Apr 16 15:25:46.923888 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:46.923852 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7dd7479dcd-gjrgd"]
Apr 16 15:25:46.927329 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:46.927313 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7dd7479dcd-gjrgd"
Apr 16 15:25:46.931468 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:46.931444 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 16 15:25:46.931591 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:46.931444 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-fssxm\""
Apr 16 15:25:46.936075 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:46.936050 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7dd7479dcd-gjrgd"]
Apr 16 15:25:47.010150 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:47.010119 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z25bh\" (UniqueName: \"kubernetes.io/projected/0021a4bb-8b0a-4a69-89af-271f7be9748a-kube-api-access-z25bh\") pod \"authorino-7dd7479dcd-gjrgd\" (UID: \"0021a4bb-8b0a-4a69-89af-271f7be9748a\") " pod="kuadrant-system/authorino-7dd7479dcd-gjrgd"
Apr 16 15:25:47.010308 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:47.010223 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0021a4bb-8b0a-4a69-89af-271f7be9748a-tls-cert\") pod \"authorino-7dd7479dcd-gjrgd\" (UID: \"0021a4bb-8b0a-4a69-89af-271f7be9748a\") " pod="kuadrant-system/authorino-7dd7479dcd-gjrgd"
Apr 16 15:25:47.110753 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:47.110722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z25bh\" (UniqueName: \"kubernetes.io/projected/0021a4bb-8b0a-4a69-89af-271f7be9748a-kube-api-access-z25bh\") pod \"authorino-7dd7479dcd-gjrgd\" (UID: \"0021a4bb-8b0a-4a69-89af-271f7be9748a\") " pod="kuadrant-system/authorino-7dd7479dcd-gjrgd"
Apr 16 15:25:47.110934 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:47.110791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0021a4bb-8b0a-4a69-89af-271f7be9748a-tls-cert\") pod \"authorino-7dd7479dcd-gjrgd\" (UID: \"0021a4bb-8b0a-4a69-89af-271f7be9748a\") " pod="kuadrant-system/authorino-7dd7479dcd-gjrgd"
Apr 16 15:25:47.113266 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:47.113245 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0021a4bb-8b0a-4a69-89af-271f7be9748a-tls-cert\") pod \"authorino-7dd7479dcd-gjrgd\" (UID: \"0021a4bb-8b0a-4a69-89af-271f7be9748a\") " pod="kuadrant-system/authorino-7dd7479dcd-gjrgd"
Apr 16 15:25:47.121215 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:47.121181 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z25bh\" (UniqueName: \"kubernetes.io/projected/0021a4bb-8b0a-4a69-89af-271f7be9748a-kube-api-access-z25bh\") pod \"authorino-7dd7479dcd-gjrgd\" (UID: \"0021a4bb-8b0a-4a69-89af-271f7be9748a\") " pod="kuadrant-system/authorino-7dd7479dcd-gjrgd"
Apr 16 15:25:47.237369 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:47.237283 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7dd7479dcd-gjrgd"
Apr 16 15:25:47.363530 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:47.363502 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7dd7479dcd-gjrgd"]
Apr 16 15:25:47.364629 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:25:47.364603 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0021a4bb_8b0a_4a69_89af_271f7be9748a.slice/crio-70e16a0cc056bfbabecf7201f8d34e69c08f9ee7769b56197e6c42ebea1c0901 WatchSource:0}: Error finding container 70e16a0cc056bfbabecf7201f8d34e69c08f9ee7769b56197e6c42ebea1c0901: Status 404 returned error can't find the container with id 70e16a0cc056bfbabecf7201f8d34e69c08f9ee7769b56197e6c42ebea1c0901
Apr 16 15:25:48.022963 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:48.022919 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7dd7479dcd-gjrgd" event={"ID":"0021a4bb-8b0a-4a69-89af-271f7be9748a","Type":"ContainerStarted","Data":"8ab86d83b88d59ee46f216a50f2055d5767bbfba3b381fc3d166b344d1c0128f"}
Apr 16 15:25:48.023355 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:48.022970 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7dd7479dcd-gjrgd" event={"ID":"0021a4bb-8b0a-4a69-89af-271f7be9748a","Type":"ContainerStarted","Data":"70e16a0cc056bfbabecf7201f8d34e69c08f9ee7769b56197e6c42ebea1c0901"}
Apr 16 15:25:48.044048 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:25:48.044004 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7dd7479dcd-gjrgd" podStartSLOduration=1.555173995 podStartE2EDuration="2.043990729s" podCreationTimestamp="2026-04-16 15:25:46 +0000 UTC" firstStartedPulling="2026-04-16 15:25:47.365877704 +0000 UTC m=+825.952081490" lastFinishedPulling="2026-04-16 15:25:47.854694437 +0000 UTC m=+826.440898224" observedRunningTime="2026-04-16 15:25:48.04157271 +0000 UTC m=+826.627776529" watchObservedRunningTime="2026-04-16 15:25:48.043990729 +0000 UTC m=+826.630194535"
Apr 16 15:27:01.963467 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:01.963436 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log"
Apr 16 15:27:01.963937 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:01.963631 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log"
Apr 16 15:27:01.968206 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:01.968182 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log"
Apr 16 15:27:01.968468 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:01.968452 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log"
Apr 16 15:27:11.897308 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:11.897275 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-68fcd65c4f-4d5nw"]
Apr 16 15:27:11.897874 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:11.897504 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" podUID="75c4d4e1-bb0c-4f6c-b71c-936923b4f26b" containerName="manager" containerID="cri-o://a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0" gracePeriod=10
Apr 16 15:27:12.147947 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.147873 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw"
Apr 16 15:27:12.280702 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.280634 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94bsl\" (UniqueName: \"kubernetes.io/projected/75c4d4e1-bb0c-4f6c-b71c-936923b4f26b-kube-api-access-94bsl\") pod \"75c4d4e1-bb0c-4f6c-b71c-936923b4f26b\" (UID: \"75c4d4e1-bb0c-4f6c-b71c-936923b4f26b\") "
Apr 16 15:27:12.282911 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.282889 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c4d4e1-bb0c-4f6c-b71c-936923b4f26b-kube-api-access-94bsl" (OuterVolumeSpecName: "kube-api-access-94bsl") pod "75c4d4e1-bb0c-4f6c-b71c-936923b4f26b" (UID: "75c4d4e1-bb0c-4f6c-b71c-936923b4f26b"). InnerVolumeSpecName "kube-api-access-94bsl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:27:12.371129 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.371092 2573 generic.go:358] "Generic (PLEG): container finished" podID="75c4d4e1-bb0c-4f6c-b71c-936923b4f26b" containerID="a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0" exitCode=0
Apr 16 15:27:12.371286 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.371158 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw"
Apr 16 15:27:12.371286 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.371169 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" event={"ID":"75c4d4e1-bb0c-4f6c-b71c-936923b4f26b","Type":"ContainerDied","Data":"a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0"}
Apr 16 15:27:12.371286 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.371216 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-68fcd65c4f-4d5nw" event={"ID":"75c4d4e1-bb0c-4f6c-b71c-936923b4f26b","Type":"ContainerDied","Data":"7c9c9770ea6dba21d9c6c22c2d9e9122030d9e68c3b932b70572af8ca0248758"}
Apr 16 15:27:12.371286 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.371237 2573 scope.go:117] "RemoveContainer" containerID="a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0"
Apr 16 15:27:12.380871 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.380852 2573 scope.go:117] "RemoveContainer" containerID="a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0"
Apr 16 15:27:12.381172 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:27:12.381147 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0\": container with ID starting with a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0 not found: ID does not exist" containerID="a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0"
Apr 16 15:27:12.381285 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.381178 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0"} err="failed to get container status \"a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0\": rpc error: code = NotFound desc = could not find container \"a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0\": container with ID starting with a2b7c485b644a25af207dba6465615036f0ef565bc0d112b4506889b466d00f0 not found: ID does not exist"
Apr 16 15:27:12.381412 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.381393 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94bsl\" (UniqueName: \"kubernetes.io/projected/75c4d4e1-bb0c-4f6c-b71c-936923b4f26b-kube-api-access-94bsl\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:27:12.394859 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.394833 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-68fcd65c4f-4d5nw"]
Apr 16 15:27:12.398199 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:12.398138 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-68fcd65c4f-4d5nw"]
Apr 16 15:27:14.018986 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:27:14.018957 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75c4d4e1-bb0c-4f6c-b71c-936923b4f26b" path="/var/lib/kubelet/pods/75c4d4e1-bb0c-4f6c-b71c-936923b4f26b/volumes"
Apr 16 15:30:00.143635 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.143603 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29605890-xczlf"]
Apr 16 15:30:00.144133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.144000 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75c4d4e1-bb0c-4f6c-b71c-936923b4f26b" containerName="manager"
Apr 16 15:30:00.144133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.144014 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c4d4e1-bb0c-4f6c-b71c-936923b4f26b" containerName="manager"
Apr 16 15:30:00.144133 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.144096 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="75c4d4e1-bb0c-4f6c-b71c-936923b4f26b" containerName="manager"
Apr 16 15:30:00.151958 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.151928 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf"
Apr 16 15:30:00.154621 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.154601 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-bdctg\""
Apr 16 15:30:00.160673 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.160627 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605890-xczlf"]
Apr 16 15:30:00.200931 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.200905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ldck\" (UniqueName: \"kubernetes.io/projected/e0ae1352-9e00-4775-bc3e-230c5a4af422-kube-api-access-5ldck\") pod \"maas-api-key-cleanup-29605890-xczlf\" (UID: \"e0ae1352-9e00-4775-bc3e-230c5a4af422\") " pod="opendatahub/maas-api-key-cleanup-29605890-xczlf"
Apr 16 15:30:00.301413 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.301380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ldck\" (UniqueName: \"kubernetes.io/projected/e0ae1352-9e00-4775-bc3e-230c5a4af422-kube-api-access-5ldck\") pod \"maas-api-key-cleanup-29605890-xczlf\" (UID: \"e0ae1352-9e00-4775-bc3e-230c5a4af422\") " pod="opendatahub/maas-api-key-cleanup-29605890-xczlf"
Apr 16 15:30:00.310033 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.310002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ldck\" (UniqueName: \"kubernetes.io/projected/e0ae1352-9e00-4775-bc3e-230c5a4af422-kube-api-access-5ldck\") pod \"maas-api-key-cleanup-29605890-xczlf\" (UID: \"e0ae1352-9e00-4775-bc3e-230c5a4af422\") " pod="opendatahub/maas-api-key-cleanup-29605890-xczlf"
Apr 16 15:30:00.463570 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.463539 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf"
Apr 16 15:30:00.585921 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.585894 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605890-xczlf"]
Apr 16 15:30:00.587130 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:30:00.587107 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ae1352_9e00_4775_bc3e_230c5a4af422.slice/crio-4410bd76898e6c90bea8e3dd9fa6ba28bf9183668ef3af71a16b9965ff59f5f0 WatchSource:0}: Error finding container 4410bd76898e6c90bea8e3dd9fa6ba28bf9183668ef3af71a16b9965ff59f5f0: Status 404 returned error can't find the container with id 4410bd76898e6c90bea8e3dd9fa6ba28bf9183668ef3af71a16b9965ff59f5f0
Apr 16 15:30:00.589257 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:00.589240 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:30:01.033007 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:01.032969 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" event={"ID":"e0ae1352-9e00-4775-bc3e-230c5a4af422","Type":"ContainerStarted","Data":"4410bd76898e6c90bea8e3dd9fa6ba28bf9183668ef3af71a16b9965ff59f5f0"}
Apr 16 15:30:03.041717 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:03.041676 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" event={"ID":"e0ae1352-9e00-4775-bc3e-230c5a4af422","Type":"ContainerStarted","Data":"588725822c97da547cd3509b416a0a92ec71b11741840fed434c31fa3b1856ed"}
Apr 16 15:30:03.056154 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:03.056090 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" podStartSLOduration=1.860647605 podStartE2EDuration="3.056077356s" podCreationTimestamp="2026-04-16 15:30:00 +0000 UTC" firstStartedPulling="2026-04-16 15:30:00.589362707 +0000 UTC m=+1079.175566494" lastFinishedPulling="2026-04-16 15:30:01.784792457 +0000 UTC m=+1080.370996245" observedRunningTime="2026-04-16 15:30:03.055856927 +0000 UTC m=+1081.642060738" watchObservedRunningTime="2026-04-16 15:30:03.056077356 +0000 UTC m=+1081.642281164" Apr 16 15:30:23.118590 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:23.118557 2573 generic.go:358] "Generic (PLEG): container finished" podID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerID="588725822c97da547cd3509b416a0a92ec71b11741840fed434c31fa3b1856ed" exitCode=6 Apr 16 15:30:23.118994 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:23.118631 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" event={"ID":"e0ae1352-9e00-4775-bc3e-230c5a4af422","Type":"ContainerDied","Data":"588725822c97da547cd3509b416a0a92ec71b11741840fed434c31fa3b1856ed"} Apr 16 15:30:23.119052 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:23.118994 2573 scope.go:117] "RemoveContainer" containerID="588725822c97da547cd3509b416a0a92ec71b11741840fed434c31fa3b1856ed" Apr 16 15:30:24.124559 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:24.124525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" event={"ID":"e0ae1352-9e00-4775-bc3e-230c5a4af422","Type":"ContainerStarted","Data":"ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb"} Apr 16 15:30:44.200592 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:44.200562 2573 generic.go:358] "Generic (PLEG): container finished" podID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerID="ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb" exitCode=6 
Apr 16 15:30:44.201141 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:44.200637 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" event={"ID":"e0ae1352-9e00-4775-bc3e-230c5a4af422","Type":"ContainerDied","Data":"ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb"} Apr 16 15:30:44.201141 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:44.200709 2573 scope.go:117] "RemoveContainer" containerID="588725822c97da547cd3509b416a0a92ec71b11741840fed434c31fa3b1856ed" Apr 16 15:30:44.201141 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:44.201076 2573 scope.go:117] "RemoveContainer" containerID="ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb" Apr 16 15:30:44.201375 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:30:44.201347 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29605890-xczlf_opendatahub(e0ae1352-9e00-4775-bc3e-230c5a4af422)\"" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" Apr 16 15:30:59.014337 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:30:59.014299 2573 scope.go:117] "RemoveContainer" containerID="ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb" Apr 16 15:31:00.010969 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:00.010932 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605890-xczlf"] Apr 16 15:31:00.264753 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:00.264636 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" event={"ID":"e0ae1352-9e00-4775-bc3e-230c5a4af422","Type":"ContainerStarted","Data":"bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902"} Apr 16 15:31:00.265142 ip-10-0-136-151 kubenswrapper[2573]: I0416 
15:31:00.264717 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerName="cleanup" containerID="cri-o://bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902" gracePeriod=30 Apr 16 15:31:19.909916 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:19.909894 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" Apr 16 15:31:19.927841 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:19.927817 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ldck\" (UniqueName: \"kubernetes.io/projected/e0ae1352-9e00-4775-bc3e-230c5a4af422-kube-api-access-5ldck\") pod \"e0ae1352-9e00-4775-bc3e-230c5a4af422\" (UID: \"e0ae1352-9e00-4775-bc3e-230c5a4af422\") " Apr 16 15:31:19.930438 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:19.930396 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ae1352-9e00-4775-bc3e-230c5a4af422-kube-api-access-5ldck" (OuterVolumeSpecName: "kube-api-access-5ldck") pod "e0ae1352-9e00-4775-bc3e-230c5a4af422" (UID: "e0ae1352-9e00-4775-bc3e-230c5a4af422"). InnerVolumeSpecName "kube-api-access-5ldck". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:31:20.032516 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.032478 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5ldck\" (UniqueName: \"kubernetes.io/projected/e0ae1352-9e00-4775-bc3e-230c5a4af422-kube-api-access-5ldck\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\"" Apr 16 15:31:20.347407 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.347321 2573 generic.go:358] "Generic (PLEG): container finished" podID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerID="bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902" exitCode=6 Apr 16 15:31:20.347407 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.347374 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" event={"ID":"e0ae1352-9e00-4775-bc3e-230c5a4af422","Type":"ContainerDied","Data":"bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902"} Apr 16 15:31:20.347407 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.347401 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" event={"ID":"e0ae1352-9e00-4775-bc3e-230c5a4af422","Type":"ContainerDied","Data":"4410bd76898e6c90bea8e3dd9fa6ba28bf9183668ef3af71a16b9965ff59f5f0"} Apr 16 15:31:20.347614 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.347415 2573 scope.go:117] "RemoveContainer" containerID="bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902" Apr 16 15:31:20.347614 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.347413 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605890-xczlf" Apr 16 15:31:20.356413 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.356395 2573 scope.go:117] "RemoveContainer" containerID="ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb" Apr 16 15:31:20.365983 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.365859 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605890-xczlf"] Apr 16 15:31:20.366059 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.366011 2573 scope.go:117] "RemoveContainer" containerID="bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902" Apr 16 15:31:20.366345 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:31:20.366322 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902\": container with ID starting with bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902 not found: ID does not exist" containerID="bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902" Apr 16 15:31:20.366410 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.366352 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902"} err="failed to get container status \"bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902\": rpc error: code = NotFound desc = could not find container \"bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902\": container with ID starting with bf343f0aeba866cc828e4fe8e7cbca49420418563a357ee7b8a53352547e0902 not found: ID does not exist" Apr 16 15:31:20.366410 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.366370 2573 scope.go:117] "RemoveContainer" containerID="ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb" Apr 16 15:31:20.366629 ip-10-0-136-151 
kubenswrapper[2573]: E0416 15:31:20.366608 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb\": container with ID starting with ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb not found: ID does not exist" containerID="ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb" Apr 16 15:31:20.366699 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.366626 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb"} err="failed to get container status \"ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb\": rpc error: code = NotFound desc = could not find container \"ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb\": container with ID starting with ad79d40bf0c2d96d6fa41b6cd631daa7e8f0b645dcd5a9ff34e58ec00dfe97eb not found: ID does not exist" Apr 16 15:31:20.368437 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:20.368418 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605890-xczlf"] Apr 16 15:31:22.018734 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:31:22.018706 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" path="/var/lib/kubelet/pods/e0ae1352-9e00-4775-bc3e-230c5a4af422/volumes" Apr 16 15:32:01.998126 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:32:01.998094 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log" Apr 16 15:32:01.999758 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:32:01.999734 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log" Apr 16 15:32:02.002942 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:32:02.002923 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:32:02.004633 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:32:02.004618 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:37:02.035699 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:37:02.035670 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log" Apr 16 15:37:02.038892 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:37:02.038872 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log" Apr 16 15:37:02.040513 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:37:02.040493 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:37:02.043503 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:37:02.043489 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:42:02.070241 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:42:02.070214 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log" Apr 16 15:42:02.074515 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:42:02.074490 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log" Apr 16 15:42:02.074839 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:42:02.074819 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:42:02.079369 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:42:02.079351 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:45:00.137108 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.137063 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29605905-2nvfb"] Apr 16 15:45:00.139507 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.137608 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerName="cleanup" Apr 16 15:45:00.139507 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.137624 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerName="cleanup" Apr 16 15:45:00.139507 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.137641 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerName="cleanup" Apr 16 15:45:00.139507 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.137668 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerName="cleanup" Apr 16 15:45:00.139507 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.137679 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" 
containerName="cleanup" Apr 16 15:45:00.139507 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.137685 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerName="cleanup" Apr 16 15:45:00.139507 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.137753 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerName="cleanup" Apr 16 15:45:00.139507 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.137760 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerName="cleanup" Apr 16 15:45:00.140823 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.140797 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" Apr 16 15:45:00.143490 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.143470 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-bdctg\"" Apr 16 15:45:00.154810 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.154791 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605905-2nvfb"] Apr 16 15:45:00.241031 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.241001 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwmhw\" (UniqueName: \"kubernetes.io/projected/96dcf52e-b3b9-450c-932d-ddc0fa7857a4-kube-api-access-jwmhw\") pod \"maas-api-key-cleanup-29605905-2nvfb\" (UID: \"96dcf52e-b3b9-450c-932d-ddc0fa7857a4\") " pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" Apr 16 15:45:00.341829 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.341782 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwmhw\" (UniqueName: 
\"kubernetes.io/projected/96dcf52e-b3b9-450c-932d-ddc0fa7857a4-kube-api-access-jwmhw\") pod \"maas-api-key-cleanup-29605905-2nvfb\" (UID: \"96dcf52e-b3b9-450c-932d-ddc0fa7857a4\") " pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" Apr 16 15:45:00.350388 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.350358 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwmhw\" (UniqueName: \"kubernetes.io/projected/96dcf52e-b3b9-450c-932d-ddc0fa7857a4-kube-api-access-jwmhw\") pod \"maas-api-key-cleanup-29605905-2nvfb\" (UID: \"96dcf52e-b3b9-450c-932d-ddc0fa7857a4\") " pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" Apr 16 15:45:00.450934 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.450906 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" Apr 16 15:45:00.579279 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.579254 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605905-2nvfb"] Apr 16 15:45:00.582039 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:45:00.582005 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96dcf52e_b3b9_450c_932d_ddc0fa7857a4.slice/crio-0ac7999766e32429c3f88d114082a47378268fa6addaf67d62ed1488dea98508 WatchSource:0}: Error finding container 0ac7999766e32429c3f88d114082a47378268fa6addaf67d62ed1488dea98508: Status 404 returned error can't find the container with id 0ac7999766e32429c3f88d114082a47378268fa6addaf67d62ed1488dea98508 Apr 16 15:45:00.583854 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:00.583838 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:45:01.577159 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:01.577115 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" event={"ID":"96dcf52e-b3b9-450c-932d-ddc0fa7857a4","Type":"ContainerStarted","Data":"740c4c8fd72809041f32d23f53c9538a019f426032b572af17f26327f909553e"} Apr 16 15:45:01.577159 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:01.577164 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" event={"ID":"96dcf52e-b3b9-450c-932d-ddc0fa7857a4","Type":"ContainerStarted","Data":"0ac7999766e32429c3f88d114082a47378268fa6addaf67d62ed1488dea98508"} Apr 16 15:45:01.593822 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:01.593773 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" podStartSLOduration=1.593759462 podStartE2EDuration="1.593759462s" podCreationTimestamp="2026-04-16 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:45:01.591200263 +0000 UTC m=+1980.177404073" watchObservedRunningTime="2026-04-16 15:45:01.593759462 +0000 UTC m=+1980.179963270" Apr 16 15:45:21.657325 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:21.657293 2573 generic.go:358] "Generic (PLEG): container finished" podID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" containerID="740c4c8fd72809041f32d23f53c9538a019f426032b572af17f26327f909553e" exitCode=6 Apr 16 15:45:21.657772 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:21.657367 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" event={"ID":"96dcf52e-b3b9-450c-932d-ddc0fa7857a4","Type":"ContainerDied","Data":"740c4c8fd72809041f32d23f53c9538a019f426032b572af17f26327f909553e"} Apr 16 15:45:21.657772 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:21.657691 2573 scope.go:117] "RemoveContainer" containerID="740c4c8fd72809041f32d23f53c9538a019f426032b572af17f26327f909553e" Apr 16 15:45:22.663670 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:45:22.663614 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" event={"ID":"96dcf52e-b3b9-450c-932d-ddc0fa7857a4","Type":"ContainerStarted","Data":"c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447"} Apr 16 15:45:42.752232 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:42.752194 2573 generic.go:358] "Generic (PLEG): container finished" podID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" containerID="c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447" exitCode=6 Apr 16 15:45:42.752763 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:42.752243 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" event={"ID":"96dcf52e-b3b9-450c-932d-ddc0fa7857a4","Type":"ContainerDied","Data":"c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447"} Apr 16 15:45:42.752763 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:42.752283 2573 scope.go:117] "RemoveContainer" containerID="740c4c8fd72809041f32d23f53c9538a019f426032b572af17f26327f909553e" Apr 16 15:45:42.752763 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:42.752616 2573 scope.go:117] "RemoveContainer" containerID="c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447" Apr 16 15:45:42.752917 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:45:42.752865 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29605905-2nvfb_opendatahub(96dcf52e-b3b9-450c-932d-ddc0fa7857a4)\"" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" podUID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" Apr 16 15:45:54.014559 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:54.014527 2573 scope.go:117] "RemoveContainer" containerID="c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447" Apr 16 15:45:54.800302 
ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:54.800266 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" event={"ID":"96dcf52e-b3b9-450c-932d-ddc0fa7857a4","Type":"ContainerStarted","Data":"a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa"} Apr 16 15:45:55.037431 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:55.037394 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605905-2nvfb"] Apr 16 15:45:55.803967 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:45:55.803907 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" podUID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" containerName="cleanup" containerID="cri-o://a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa" gracePeriod=30 Apr 16 15:46:14.844134 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.844109 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" Apr 16 15:46:14.880922 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.880892 2573 generic.go:358] "Generic (PLEG): container finished" podID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" containerID="a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa" exitCode=6 Apr 16 15:46:14.881078 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.880965 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb"
Apr 16 15:46:14.881078 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.880966 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" event={"ID":"96dcf52e-b3b9-450c-932d-ddc0fa7857a4","Type":"ContainerDied","Data":"a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa"}
Apr 16 15:46:14.881078 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.881018 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605905-2nvfb" event={"ID":"96dcf52e-b3b9-450c-932d-ddc0fa7857a4","Type":"ContainerDied","Data":"0ac7999766e32429c3f88d114082a47378268fa6addaf67d62ed1488dea98508"}
Apr 16 15:46:14.881078 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.881037 2573 scope.go:117] "RemoveContainer" containerID="a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa"
Apr 16 15:46:14.890500 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.890480 2573 scope.go:117] "RemoveContainer" containerID="c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447"
Apr 16 15:46:14.898631 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.898616 2573 scope.go:117] "RemoveContainer" containerID="a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa"
Apr 16 15:46:14.898964 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:46:14.898946 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa\": container with ID starting with a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa not found: ID does not exist" containerID="a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa"
Apr 16 15:46:14.899018 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.898974 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa"} err="failed to get container status \"a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa\": rpc error: code = NotFound desc = could not find container \"a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa\": container with ID starting with a891cf64545b6fafa47d7b92b57a33769b88b960dd58953b91ebfe18e220bcaa not found: ID does not exist"
Apr 16 15:46:14.899018 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.898993 2573 scope.go:117] "RemoveContainer" containerID="c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447"
Apr 16 15:46:14.899229 ip-10-0-136-151 kubenswrapper[2573]: E0416 15:46:14.899213 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447\": container with ID starting with c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447 not found: ID does not exist" containerID="c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447"
Apr 16 15:46:14.899283 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.899243 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447"} err="failed to get container status \"c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447\": rpc error: code = NotFound desc = could not find container \"c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447\": container with ID starting with c61f0be81e0fb7fb8cfd706be151827828cadd1299188e70845ac532c4618447 not found: ID does not exist"
Apr 16 15:46:14.923597 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.923549 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwmhw\" (UniqueName: \"kubernetes.io/projected/96dcf52e-b3b9-450c-932d-ddc0fa7857a4-kube-api-access-jwmhw\") pod \"96dcf52e-b3b9-450c-932d-ddc0fa7857a4\" (UID: \"96dcf52e-b3b9-450c-932d-ddc0fa7857a4\") "
Apr 16 15:46:14.925605 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:14.925581 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96dcf52e-b3b9-450c-932d-ddc0fa7857a4-kube-api-access-jwmhw" (OuterVolumeSpecName: "kube-api-access-jwmhw") pod "96dcf52e-b3b9-450c-932d-ddc0fa7857a4" (UID: "96dcf52e-b3b9-450c-932d-ddc0fa7857a4"). InnerVolumeSpecName "kube-api-access-jwmhw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:46:15.025025 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:15.024994 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jwmhw\" (UniqueName: \"kubernetes.io/projected/96dcf52e-b3b9-450c-932d-ddc0fa7857a4-kube-api-access-jwmhw\") on node \"ip-10-0-136-151.ec2.internal\" DevicePath \"\""
Apr 16 15:46:15.203381 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:15.203351 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605905-2nvfb"]
Apr 16 15:46:15.205615 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:15.205595 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605905-2nvfb"]
Apr 16 15:46:16.018518 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:46:16.018484 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" path="/var/lib/kubelet/pods/96dcf52e-b3b9-450c-932d-ddc0fa7857a4/volumes"
Apr 16 15:47:02.104334 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:47:02.104303 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log"
Apr 16 15:47:02.108528 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:47:02.108502 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log"
Apr 16 15:47:02.110741 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:47:02.110724 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log"
Apr 16 15:47:02.114846 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:47:02.114828 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log"
Apr 16 15:48:09.798669 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:09.798628 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7dd7479dcd-gjrgd_0021a4bb-8b0a-4a69-89af-271f7be9748a/authorino/0.log"
Apr 16 15:48:14.110464 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:14.110436 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-68df4b58f7-fsbq9_426be61e-17e1-4f01-b19f-3ab462b45ca7/manager/0.log"
Apr 16 15:48:14.484803 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:14.484780 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-29667_14b38495-0655-4ac1-a07f-f1bb931d9b6d/postgres/0.log"
Apr 16 15:48:15.246344 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.246309 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw_844c3ca6-337b-4c63-b069-2bb7c48e6f4f/util/0.log"
Apr 16 15:48:15.257949 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.257913 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw_844c3ca6-337b-4c63-b069-2bb7c48e6f4f/pull/0.log"
Apr 16 15:48:15.264258 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.264218 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw_844c3ca6-337b-4c63-b069-2bb7c48e6f4f/extract/0.log"
Apr 16 15:48:15.374494 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.374467 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b_3cd7de14-76d8-4975-a48c-583dc8ead026/pull/0.log"
Apr 16 15:48:15.380689 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.380667 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b_3cd7de14-76d8-4975-a48c-583dc8ead026/extract/0.log"
Apr 16 15:48:15.386822 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.386798 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b_3cd7de14-76d8-4975-a48c-583dc8ead026/util/0.log"
Apr 16 15:48:15.501121 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.501054 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw_5c7b8a7a-8603-481a-b226-086efa4a6d4f/util/0.log"
Apr 16 15:48:15.507953 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.507934 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw_5c7b8a7a-8603-481a-b226-086efa4a6d4f/pull/0.log"
Apr 16 15:48:15.514598 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.514583 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw_5c7b8a7a-8603-481a-b226-086efa4a6d4f/extract/0.log"
Apr 16 15:48:15.627995 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.627966 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w_5935dea1-ae73-40c9-98ae-ed32f97ce643/pull/0.log"
Apr 16 15:48:15.634305 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.634284 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w_5935dea1-ae73-40c9-98ae-ed32f97ce643/extract/0.log"
Apr 16 15:48:15.640425 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.640405 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w_5935dea1-ae73-40c9-98ae-ed32f97ce643/util/0.log"
Apr 16 15:48:15.763863 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:15.763783 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7dd7479dcd-gjrgd_0021a4bb-8b0a-4a69-89af-271f7be9748a/authorino/0.log"
Apr 16 15:48:16.099221 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:16.099147 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-t42zx_83c616ca-a5f0-4123-9bfa-474d0ba7d534/kuadrant-console-plugin/0.log"
Apr 16 15:48:17.242527 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:17.242495 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5bb547c98c-s85c9_6b1f0daf-7085-452a-bcf8-246adaa9a7b1/kube-auth-proxy/0.log"
Apr 16 15:48:17.468223 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:17.468191 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-58ff4b54fd-qzf5r_c378b37b-c6a3-4eee-8f58-73402bb067b1/router/0.log"
Apr 16 15:48:17.809745 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:17.809714 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h_ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b/storage-initializer/0.log"
Apr 16 15:48:17.816694 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:17.816668 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-x7v5h_ae36f93c-ad1a-4b1c-8155-9b0371fd4f5b/main/0.log"
Apr 16 15:48:18.397956 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:18.397929 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx_2d34e857-d835-4460-9449-4bc59c723ae8/storage-initializer/0.log"
Apr 16 15:48:18.404829 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:18.404809 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-bkbxx_2d34e857-d835-4460-9449-4bc59c723ae8/main/0.log"
Apr 16 15:48:24.822397 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:24.822366 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mspjk_d685ae4d-d7ff-407e-bf27-ec6402f786a0/global-pull-secret-syncer/0.log"
Apr 16 15:48:24.909168 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:24.909138 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2dkml_baca9b64-0c10-49d7-9301-8d81286c87b6/konnectivity-agent/0.log"
Apr 16 15:48:25.038599 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:25.038568 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-151.ec2.internal_965c3203e0e8294321ef158b59084066/haproxy/0.log"
Apr 16 15:48:28.644941 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.644832 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw_844c3ca6-337b-4c63-b069-2bb7c48e6f4f/extract/0.log"
Apr 16 15:48:28.672724 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.672691 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw_844c3ca6-337b-4c63-b069-2bb7c48e6f4f/util/0.log"
Apr 16 15:48:28.694978 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.694959 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594h5xw_844c3ca6-337b-4c63-b069-2bb7c48e6f4f/pull/0.log"
Apr 16 15:48:28.722347 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.722302 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b_3cd7de14-76d8-4975-a48c-583dc8ead026/extract/0.log"
Apr 16 15:48:28.753179 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.753156 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b_3cd7de14-76d8-4975-a48c-583dc8ead026/util/0.log"
Apr 16 15:48:28.779124 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.779099 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08zb6b_3cd7de14-76d8-4975-a48c-583dc8ead026/pull/0.log"
Apr 16 15:48:28.812814 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.812792 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw_5c7b8a7a-8603-481a-b226-086efa4a6d4f/extract/0.log"
Apr 16 15:48:28.835196 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.835175 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw_5c7b8a7a-8603-481a-b226-086efa4a6d4f/util/0.log"
Apr 16 15:48:28.856064 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.856044 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735jmgw_5c7b8a7a-8603-481a-b226-086efa4a6d4f/pull/0.log"
Apr 16 15:48:28.893608 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.893581 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w_5935dea1-ae73-40c9-98ae-ed32f97ce643/extract/0.log"
Apr 16 15:48:28.914550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.914521 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w_5935dea1-ae73-40c9-98ae-ed32f97ce643/util/0.log"
Apr 16 15:48:28.934598 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:28.934571 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tpf8w_5935dea1-ae73-40c9-98ae-ed32f97ce643/pull/0.log"
Apr 16 15:48:29.130102 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:29.130073 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7dd7479dcd-gjrgd_0021a4bb-8b0a-4a69-89af-271f7be9748a/authorino/0.log"
Apr 16 15:48:29.215171 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:29.215093 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-t42zx_83c616ca-a5f0-4123-9bfa-474d0ba7d534/kuadrant-console-plugin/0.log"
Apr 16 15:48:30.897191 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:30.897159 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_85937bc3-314c-43ed-a893-fa81d55b0bc5/alertmanager/0.log"
Apr 16 15:48:30.920789 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:30.920769 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_85937bc3-314c-43ed-a893-fa81d55b0bc5/config-reloader/0.log"
Apr 16 15:48:30.944374 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:30.944349 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_85937bc3-314c-43ed-a893-fa81d55b0bc5/kube-rbac-proxy-web/0.log"
Apr 16 15:48:30.966141 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:30.966072 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_85937bc3-314c-43ed-a893-fa81d55b0bc5/kube-rbac-proxy/0.log"
Apr 16 15:48:30.988045 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:30.988023 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_85937bc3-314c-43ed-a893-fa81d55b0bc5/kube-rbac-proxy-metric/0.log"
Apr 16 15:48:31.009984 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.009953 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_85937bc3-314c-43ed-a893-fa81d55b0bc5/prom-label-proxy/0.log"
Apr 16 15:48:31.031072 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.031047 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_85937bc3-314c-43ed-a893-fa81d55b0bc5/init-config-reloader/0.log"
Apr 16 15:48:31.187030 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.186989 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5b68b96f84-tmrm8_fe009bf2-e9e3-43d9-a8e5-fa926d26d0ec/metrics-server/0.log"
Apr 16 15:48:31.393019 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.392921 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qz4mc_a114c53c-19c2-4d2c-8001-45b045bfca65/node-exporter/0.log"
Apr 16 15:48:31.417304 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.417276 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qz4mc_a114c53c-19c2-4d2c-8001-45b045bfca65/kube-rbac-proxy/0.log"
Apr 16 15:48:31.439966 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.439942 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qz4mc_a114c53c-19c2-4d2c-8001-45b045bfca65/init-textfile/0.log"
Apr 16 15:48:31.742552 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.742520 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-2c5vl_f973310e-d7cd-4927-b934-0e10d5013bd8/prometheus-operator/0.log"
Apr 16 15:48:31.768290 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.768270 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-2c5vl_f973310e-d7cd-4927-b934-0e10d5013bd8/kube-rbac-proxy/0.log"
Apr 16 15:48:31.796890 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.796855 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-fcsws_9a07ebbc-7250-4238-9bbe-e0feea8ed073/prometheus-operator-admission-webhook/0.log"
Apr 16 15:48:31.825227 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.825191 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6db7cb95c8-rz2z4_3be4ec10-6366-4b37-824d-713912b61673/telemeter-client/0.log"
Apr 16 15:48:31.845745 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.845723 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6db7cb95c8-rz2z4_3be4ec10-6366-4b37-824d-713912b61673/reload/0.log"
Apr 16 15:48:31.867086 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.867052 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6db7cb95c8-rz2z4_3be4ec10-6366-4b37-824d-713912b61673/kube-rbac-proxy/0.log"
Apr 16 15:48:31.896540 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.896517 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695cc4d4c4-tn6ql_fd886ad9-e063-4c34-8522-7e261bdfe21c/thanos-query/0.log"
Apr 16 15:48:31.917036 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.917013 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695cc4d4c4-tn6ql_fd886ad9-e063-4c34-8522-7e261bdfe21c/kube-rbac-proxy-web/0.log"
Apr 16 15:48:31.937483 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.937459 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695cc4d4c4-tn6ql_fd886ad9-e063-4c34-8522-7e261bdfe21c/kube-rbac-proxy/0.log"
Apr 16 15:48:31.970142 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.970120 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695cc4d4c4-tn6ql_fd886ad9-e063-4c34-8522-7e261bdfe21c/prom-label-proxy/0.log"
Apr 16 15:48:31.990434 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:31.990416 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695cc4d4c4-tn6ql_fd886ad9-e063-4c34-8522-7e261bdfe21c/kube-rbac-proxy-rules/0.log"
Apr 16 15:48:32.010861 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:32.010775 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695cc4d4c4-tn6ql_fd886ad9-e063-4c34-8522-7e261bdfe21c/kube-rbac-proxy-metrics/0.log"
Apr 16 15:48:33.401982 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.401900 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"]
Apr 16 15:48:33.402416 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.402356 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" containerName="cleanup"
Apr 16 15:48:33.402416 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.402369 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" containerName="cleanup"
Apr 16 15:48:33.402416 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.402380 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" containerName="cleanup"
Apr 16 15:48:33.402416 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.402385 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" containerName="cleanup"
Apr 16 15:48:33.402550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.402455 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0ae1352-9e00-4775-bc3e-230c5a4af422" containerName="cleanup"
Apr 16 15:48:33.402550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.402464 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" containerName="cleanup"
Apr 16 15:48:33.402550 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.402473 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="96dcf52e-b3b9-450c-932d-ddc0fa7857a4" containerName="cleanup"
Apr 16 15:48:33.405658 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.405628 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.408442 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.408419 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t57q5\"/\"kube-root-ca.crt\""
Apr 16 15:48:33.408574 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.408422 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-t57q5\"/\"default-dockercfg-88qxm\""
Apr 16 15:48:33.408574 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.408470 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t57q5\"/\"openshift-service-ca.crt\""
Apr 16 15:48:33.415318 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.415295 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"]
Apr 16 15:48:33.558314 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.558281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-proc\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.558454 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.558330 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwvvc\" (UniqueName: \"kubernetes.io/projected/a1850b3f-33d2-448f-b3fc-1062ab270c21-kube-api-access-pwvvc\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.558454 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.558436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-lib-modules\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.558541 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.558470 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-sys\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.558541 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.558519 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-podres\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.659678 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.659583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-lib-modules\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.659678 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.659620 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-sys\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.659876 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.659687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-podres\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.659876 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.659707 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-proc\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.659876 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.659755 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwvvc\" (UniqueName: \"kubernetes.io/projected/a1850b3f-33d2-448f-b3fc-1062ab270c21-kube-api-access-pwvvc\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.659876 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.659784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-lib-modules\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.659876 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.659795 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-sys\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.659876 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.659824 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-proc\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.659876 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.659865 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a1850b3f-33d2-448f-b3fc-1062ab270c21-podres\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.667204 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.667185 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwvvc\" (UniqueName: \"kubernetes.io/projected/a1850b3f-33d2-448f-b3fc-1062ab270c21-kube-api-access-pwvvc\") pod \"perf-node-gather-daemonset-rfckk\" (UID: \"a1850b3f-33d2-448f-b3fc-1062ab270c21\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.717025 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.717004 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:33.766030 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.766002 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/2.log"
Apr 16 15:48:33.773511 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.773447 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-xlsw6_22e6f93e-aa56-4c51-85da-4f105af8d76d/console-operator/3.log"
Apr 16 15:48:33.841486 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:33.841461 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"]
Apr 16 15:48:33.842433 ip-10-0-136-151 kubenswrapper[2573]: W0416 15:48:33.842411 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda1850b3f_33d2_448f_b3fc_1062ab270c21.slice/crio-ff41e8c3db45d22bf1c37a38facfb24f6e15a89f5c0a973f69c6ba62936800fb WatchSource:0}: Error finding container ff41e8c3db45d22bf1c37a38facfb24f6e15a89f5c0a973f69c6ba62936800fb: Status 404 returned error can't find the container with id ff41e8c3db45d22bf1c37a38facfb24f6e15a89f5c0a973f69c6ba62936800fb
Apr 16 15:48:34.260383 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:34.260311 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bcb57dd94-r6xk2_d8d2ac3f-dd82-4da5-8eae-3d9fe0a7d38c/console/0.log"
Apr 16 15:48:34.289963 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:34.289940 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-pggd8_3ae8c896-dcf3-4414-83ac-16e14e874dce/download-server/0.log"
Apr 16 15:48:34.429973 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:34.429942 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk" event={"ID":"a1850b3f-33d2-448f-b3fc-1062ab270c21","Type":"ContainerStarted","Data":"23e075e1fe0af6845ea04154321492fd714b2478078c174bfc1552c09fc9ecfa"}
Apr 16 15:48:34.429973 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:34.429979 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk" event={"ID":"a1850b3f-33d2-448f-b3fc-1062ab270c21","Type":"ContainerStarted","Data":"ff41e8c3db45d22bf1c37a38facfb24f6e15a89f5c0a973f69c6ba62936800fb"}
Apr 16 15:48:34.430372 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:34.430058 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:34.444872 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:34.444819 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk" podStartSLOduration=1.444807344 podStartE2EDuration="1.444807344s" podCreationTimestamp="2026-04-16 15:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:48:34.444791284 +0000 UTC m=+2193.030995094" watchObservedRunningTime="2026-04-16 15:48:34.444807344 +0000 UTC m=+2193.031011152"
Apr 16 15:48:34.789938 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:34.789912 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-sx5pd_af0b227b-f370-498a-9afb-9106b7108461/volume-data-source-validator/0.log"
Apr 16 15:48:35.625153 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:35.625119 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mgxdh_5c03038e-c257-4668-a7fa-62c438af26c8/dns/0.log"
Apr 16 15:48:35.646123 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:35.646103 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mgxdh_5c03038e-c257-4668-a7fa-62c438af26c8/kube-rbac-proxy/0.log"
Apr 16 15:48:35.761261 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:35.761234 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sjt8j_1462d2c4-a9b3-43d9-a968-33a1359d6ccc/dns-node-resolver/0.log"
Apr 16 15:48:36.232268 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:36.232235 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4pfr6_bf188288-b071-4a5d-867d-17022354cd51/node-ca/0.log"
Apr 16 15:48:37.256231 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:37.256204 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5bb547c98c-s85c9_6b1f0daf-7085-452a-bcf8-246adaa9a7b1/kube-auth-proxy/0.log"
Apr 16 15:48:37.340220 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:37.340177 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-58ff4b54fd-qzf5r_c378b37b-c6a3-4eee-8f58-73402bb067b1/router/0.log"
Apr 16 15:48:37.875849 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:37.875822 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kw9vq_8d53317c-dbd7-4c13-ba73-b51e72998223/serve-healthcheck-canary/0.log"
Apr 16 15:48:38.298499 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:38.298461 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-fc75r_03b174e8-ba16-492a-9a00-8245945ade22/insights-operator/0.log"
Apr 16 15:48:38.298979 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:38.298555 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-fc75r_03b174e8-ba16-492a-9a00-8245945ade22/insights-operator/1.log"
Apr 16 15:48:38.450507 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:38.450476 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d6m95_0554221f-bc0c-43b7-ac3f-0baa33a72fc7/kube-rbac-proxy/0.log"
Apr 16 15:48:38.471073 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:38.471046 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d6m95_0554221f-bc0c-43b7-ac3f-0baa33a72fc7/exporter/0.log"
Apr 16 15:48:38.493897 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:38.493870 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d6m95_0554221f-bc0c-43b7-ac3f-0baa33a72fc7/extractor/0.log"
Apr 16 15:48:40.444694 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:40.444669 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-rfckk"
Apr 16 15:48:40.464734 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:40.464698 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-68df4b58f7-fsbq9_426be61e-17e1-4f01-b19f-3ab462b45ca7/manager/0.log"
Apr 16 15:48:40.570166 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:40.570138 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-29667_14b38495-0655-4ac1-a07f-f1bb931d9b6d/postgres/0.log"
Apr 16 15:48:41.872027 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:41.872000 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6fc585dfcd-2r765_2c8cc64e-869f-4398-8880-ccca3d33543f/manager/0.log"
Apr 16 15:48:46.185262 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:46.185232 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-gms79_3c33efa9-0d42-4773-96d7-a6aea9290b8b/migrator/0.log"
Apr 16 15:48:46.204641
ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:46.204612 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-gms79_3c33efa9-0d42-4773-96d7-a6aea9290b8b/graceful-termination/0.log" Apr 16 15:48:46.562462 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:46.562378 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-qnf9j_8f600d4b-94cc-4150-8ca8-7002385a1b56/kube-storage-version-migrator-operator/1.log" Apr 16 15:48:46.563272 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:46.563256 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-qnf9j_8f600d4b-94cc-4150-8ca8-7002385a1b56/kube-storage-version-migrator-operator/0.log" Apr 16 15:48:47.566223 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:47.566196 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8cgfx_0b3e092e-d180-4205-ac3b-9d71123acb4d/kube-multus/0.log" Apr 16 15:48:47.592457 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:47.592436 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4kjbb_914c9c02-e314-4b0c-b5f6-aab4f4681c8d/kube-multus-additional-cni-plugins/0.log" Apr 16 15:48:47.613474 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:47.613449 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4kjbb_914c9c02-e314-4b0c-b5f6-aab4f4681c8d/egress-router-binary-copy/0.log" Apr 16 15:48:47.633886 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:47.633860 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4kjbb_914c9c02-e314-4b0c-b5f6-aab4f4681c8d/cni-plugins/0.log" Apr 16 15:48:47.656241 ip-10-0-136-151 
kubenswrapper[2573]: I0416 15:48:47.656214 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4kjbb_914c9c02-e314-4b0c-b5f6-aab4f4681c8d/bond-cni-plugin/0.log" Apr 16 15:48:47.676211 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:47.676189 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4kjbb_914c9c02-e314-4b0c-b5f6-aab4f4681c8d/routeoverride-cni/0.log" Apr 16 15:48:47.696779 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:47.696758 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4kjbb_914c9c02-e314-4b0c-b5f6-aab4f4681c8d/whereabouts-cni-bincopy/0.log" Apr 16 15:48:47.717372 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:47.717353 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4kjbb_914c9c02-e314-4b0c-b5f6-aab4f4681c8d/whereabouts-cni/0.log" Apr 16 15:48:48.091355 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:48.091328 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h9c44_414282ac-76dc-4992-819a-bdf05043c09d/network-metrics-daemon/0.log" Apr 16 15:48:48.110275 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:48.110249 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h9c44_414282ac-76dc-4992-819a-bdf05043c09d/kube-rbac-proxy/0.log" Apr 16 15:48:49.765896 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:49.765868 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-controller/0.log" Apr 16 15:48:49.800726 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:49.800707 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/0.log" Apr 16 15:48:49.809741 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:49.809720 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovn-acl-logging/1.log" Apr 16 15:48:49.847004 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:49.846982 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/kube-rbac-proxy-node/0.log" Apr 16 15:48:49.885710 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:49.885688 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 15:48:49.913587 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:49.913567 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/northd/0.log" Apr 16 15:48:49.950558 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:49.950538 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/nbdb/0.log" Apr 16 15:48:49.991574 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:49.991549 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/sbdb/0.log" Apr 16 15:48:50.107420 ip-10-0-136-151 kubenswrapper[2573]: I0416 15:48:50.107342 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n96lw_edf7b8a9-878b-451c-86c3-6a78d2ece872/ovnkube-controller/0.log"