Apr 17 20:50:37.887327 ip-10-0-128-248 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 20:50:38.280663 ip-10-0-128-248 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:50:38.280663 ip-10-0-128-248 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 20:50:38.280663 ip-10-0-128-248 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:50:38.280663 ip-10-0-128-248 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 20:50:38.280663 ip-10-0-128-248 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
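The deprecation warnings above all point at the same fix: move the settings into the file passed via --config (the log later shows FLAG: --config="/etc/kubernetes/kubelet.conf"). A minimal KubeletConfiguration sketch of the equivalent fields follows; the field names are the standard kubelet.config.k8s.io/v1beta1 ones, but every value below is illustrative (the actual values for --volume-plugin-dir and --system-reserved are not shown in this excerpt):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value taken from this log's FLAG dump)
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (path shown is the upstream default, assumed here)
volumePluginDir: "/usr/libexec/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (example reservations, not taken from this log)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is replaced by eviction settings such as:
evictionHard:
  memory.available: "100Mi"
```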
Apr 17 20:50:38.281322 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.281239 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 20:50:38.284223 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284208 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:50:38.284223 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284222 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284226 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284229 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284232 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284235 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284238 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284240 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284243 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284246 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284249 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284252 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284255 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284258 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284260 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284263 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284265 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284268 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284271 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284274 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:50:38.284287 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284276 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284279 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284281 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284285 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284289 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284293 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284296 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284299 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284302 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284305 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284308 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284310 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284313 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284316 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284319 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284322 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284325 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284327 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284330 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284333 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:50:38.284742 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284335 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284337 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284340 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284343 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284346 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284348 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284351 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284353 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284356 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284358 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284361 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284363 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284366 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284369 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284371 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284375 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284378 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284380 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284383 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284386 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:50:38.285314 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284388 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284391 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284394 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284396 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284398 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284401 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284404 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284407 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284409 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284411 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284414 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284416 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284419 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284422 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284425 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284428 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284430 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284433 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284436 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284438 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:50:38.285806 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284441 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284444 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284448 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284452 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284455 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284458 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284819 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284823 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284826 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284829 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284832 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284834 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284837 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284840 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284842 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284845 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284848 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284851 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284854 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:50:38.286320 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284857 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284861 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284863 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284866 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284869 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284871 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284874 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284876 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284879 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284881 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284884 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284886 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284889 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284892 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284894 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284897 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284900 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284903 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284905 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:50:38.286815 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284908 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284910 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284913 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284915 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284918 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284920 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284923 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284926 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284928 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284932 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284934 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284937 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284939 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284943 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284945 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284949 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284952 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284955 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284958 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:50:38.287329 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284961 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284964 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284967 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284969 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284973 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284975 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284978 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284981 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284984 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284986 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284989 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284992 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284994 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.284998 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285000 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285003 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285005 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285008 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285010 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285013 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:50:38.287797 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285015 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285018 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285033 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285036 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285039 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285041 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285044 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285047 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285050 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285053 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285055 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285058 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285060 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285063 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285066 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285137 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285144 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285150 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285155 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285159 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285163 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 20:50:38.288327 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285167 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285171 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285174 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285177 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285181 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285184 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285187 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285190 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285194 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285197 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285200 2576 flags.go:64] FLAG: --cloud-config=""
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285202 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285205 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285210 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285213 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285216 2576 flags.go:64] FLAG: --config-dir=""
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285219 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285222 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285226 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285230 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285233 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285236 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285239 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285242 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 20:50:38.288849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285245 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285248 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285251 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285255 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285258 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285261 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285264 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285267 2576 flags.go:64] FLAG: --enable-server="true"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285269 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285273 2576 flags.go:64] FLAG: --event-burst="100"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285277 2576 flags.go:64] FLAG: --event-qps="50"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285280 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285283 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285286 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285290 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285293 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285296 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285299 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285302 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285305 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285308 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285311 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285314 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285318 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285321 2576 flags.go:64] FLAG: --feature-gates=""
Apr 17 20:50:38.289434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285325 2576 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285328 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285331 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285335 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285338 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285341 2576 flags.go:64] FLAG: --help="false" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285344 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285347 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285350 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285354 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285357 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285360 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285363 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285367 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 20:50:38.290049 ip-10-0-128-248 
kubenswrapper[2576]: I0417 20:50:38.285369 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285372 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285375 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285378 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285381 2576 flags.go:64] FLAG: --kube-reserved="" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285384 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285390 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285393 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285395 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285398 2576 flags.go:64] FLAG: --lock-file="" Apr 17 20:50:38.290049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285401 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285404 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285407 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285412 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285415 2576 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285418 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285421 2576 flags.go:64] FLAG: --logging-format="text" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285424 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285428 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285431 2576 flags.go:64] FLAG: --manifest-url="" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285433 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285438 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285441 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285446 2576 flags.go:64] FLAG: --max-pods="110" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285449 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285452 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285454 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285457 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285460 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 20:50:38.290622 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285463 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285466 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285473 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285476 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285480 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 20:50:38.290622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285483 2576 flags.go:64] FLAG: --pod-cidr="" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285486 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285492 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285495 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285499 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285502 2576 flags.go:64] FLAG: --port="10250" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285505 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285508 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e4ee108847f9b090" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285511 2576 flags.go:64] FLAG: --qos-reserved="" Apr 
17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285515 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285518 2576 flags.go:64] FLAG: --register-node="true" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285521 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285523 2576 flags.go:64] FLAG: --register-with-taints="" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285527 2576 flags.go:64] FLAG: --registry-burst="10" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285530 2576 flags.go:64] FLAG: --registry-qps="5" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285532 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285535 2576 flags.go:64] FLAG: --reserved-memory="" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285539 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285542 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285545 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285548 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285551 2576 flags.go:64] FLAG: --runonce="false" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285554 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285557 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285559 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 17 20:50:38.291211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285562 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285565 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285568 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285571 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285574 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285580 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285583 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285585 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285588 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285591 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285594 2576 flags.go:64] FLAG: --system-cgroups="" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285598 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285604 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 20:50:38.291821 ip-10-0-128-248 
kubenswrapper[2576]: I0417 20:50:38.285606 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285609 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285615 2576 flags.go:64] FLAG: --tls-min-version="" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285618 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285621 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285624 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285626 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285629 2576 flags.go:64] FLAG: --v="2" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285633 2576 flags.go:64] FLAG: --version="false" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285637 2576 flags.go:64] FLAG: --vmodule="" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285642 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.285645 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 20:50:38.291821 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285738 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285742 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285745 2576 feature_gate.go:328] 
unrecognized feature gate: Example Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285748 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285752 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285755 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285758 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285760 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285764 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285766 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285769 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285771 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285776 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285778 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285781 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:50:38.292453 ip-10-0-128-248 
kubenswrapper[2576]: W0417 20:50:38.285784 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285786 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285789 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285793 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285795 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:50:38.292453 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285798 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285801 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285803 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285806 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285809 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285812 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285815 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285817 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 
20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285820 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285822 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285825 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285827 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285830 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285833 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285835 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285837 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285840 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285844 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285848 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:50:38.292988 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285851 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285854 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285857 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285859 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285862 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285865 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285868 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285870 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285873 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285876 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285878 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285882 2576 feature_gate.go:328] unrecognized 
feature gate: NutanixMultiSubnets Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285885 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285887 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285890 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285892 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285895 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285898 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285900 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:50:38.293459 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285903 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285905 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285908 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285910 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285913 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata 
Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285916 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285919 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285921 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285924 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285926 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285929 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285931 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285934 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285937 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285939 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285941 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285944 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285947 2576 
feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285960 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285963 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:50:38.293914 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285966 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285970 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285974 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285977 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285981 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285984 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285986 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.285989 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.286510 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.293280 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.293388 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293435 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293441 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293444 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293447 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:50:38.294434 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293451 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293454 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293457 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293460 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293462 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293465 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293468 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293471 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293473 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293483 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293486 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293488 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293491 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293493 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293496 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293499 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293502 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293504 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293508 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293511 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:50:38.294812 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293514 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293518 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293522 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293525 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293528 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293531 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293534 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293537 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293540 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293543 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293545 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293548 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293551 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293553 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293556 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293559 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293561 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293564 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293567 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:50:38.295348 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293569 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293572 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293575 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293583 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293586 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293588 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293591 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293593 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293596 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293598 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293601 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293603 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293606 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293609 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293611 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293614 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293616 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293619 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293622 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293624 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:50:38.295809 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293627 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293630 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293633 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293635 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293637 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293640 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293643 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293645 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293647 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293650 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293653 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293655 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293658 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293660 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293663 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293665 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293674 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293677 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293680 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:50:38.296351 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293683 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293685 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293688 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293691 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.293695 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293852 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293857 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293860 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293863 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293866 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293870 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293872 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293875 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293878 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293880 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293883 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:50:38.296807 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293886 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293888 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293890 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293893 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293896 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293898 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293900 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293903 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293905 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293908 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293910 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293913 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293915 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293918 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293926 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293929 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293932 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293934 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293937 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293939 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:50:38.297212 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293942 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293944 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293947 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293965 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293970 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293973 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293977 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293980 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293983 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293985 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293988 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293991 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293993 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293996 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.293998 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294002 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294005 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294008 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294011 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:50:38.297700 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294014 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294016 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294019 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294041 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294044 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294047 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294049 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294052 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294061 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294064 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294067 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294069 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294072 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294074 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294077 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294079 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294082 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294085 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294087 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294090 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:50:38.298181 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294092 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294094 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294103 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294106 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294108 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294110 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294113 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294115 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294118 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294120 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294123 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294125 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294128 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294130 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294133 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:38.294135 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:50:38.298665 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.294139 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:50:38.299065 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.294750 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 20:50:38.299065 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.297986 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 20:50:38.299065 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.298984 2576 server.go:1019] "Starting client certificate rotation"
Apr 17 20:50:38.299065 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.299050 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:50:38.299681 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.299669 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:50:38.321811 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.321796 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:50:38.326867 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.326841 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:50:38.340390 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.340371 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 17 20:50:38.345063 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.345048 2576 log.go:25] "Validated CRI v1 image API"
Apr 17 20:50:38.347019 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.347004 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 20:50:38.350490 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.350466 2576 fs.go:135] Filesystem UUIDs: map[065ef022-c710-4bd9-a37c-d643b5d06da1:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8df7075d-e0d9-4b23-adf2-03e949b54bb1:/dev/nvme0n1p4]
Apr 17 20:50:38.350555 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.350489 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 20:50:38.356127 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.355985 2576 manager.go:217] Machine: {Timestamp:2026-04-17 20:50:38.354260414 +0000 UTC m=+0.360674439 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099950 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29e29dd58b4e4521a02dfdf790e551 SystemUUID:ec29e29d-d58b-4e45-21a0-2dfdf790e551 BootID:30bcc22f-54f8-4fdc-a097-6ebf9537b4a2 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c4:ee:f1:43:7d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c4:ee:f1:43:7d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f2:26:d4:da:86:8a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 20:50:38.356127 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.356115 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 20:50:38.356278 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.356215 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 20:50:38.356278 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.356232 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 20:50:38.357200 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.357171 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 20:50:38.357329 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.357201 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-248.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 20:50:38.357377 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.357339 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 20:50:38.357377 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.357347 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 20:50:38.357377 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.357361 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:50:38.358201 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.358190 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:50:38.358918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.358907 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 20:50:38.359037 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.359016 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 20:50:38.361944 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.361933 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 20:50:38.361995 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.361962 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 20:50:38.361995 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.361982 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 20:50:38.362083 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.361996 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 17 20:50:38.362083 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.362008 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 20:50:38.363043 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.363019 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 20:50:38.363080 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.363052 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 20:50:38.365930 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.365913 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 20:50:38.367220 ip-10-0-128-248
kubenswrapper[2576]: I0417 20:50:38.367208 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 20:50:38.368825 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368814 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 20:50:38.368859 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368830 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 20:50:38.368859 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368837 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 20:50:38.368859 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368842 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 20:50:38.368859 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368849 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 20:50:38.368859 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368854 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 20:50:38.368859 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368860 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 20:50:38.369059 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368865 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 20:50:38.369059 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368873 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 20:50:38.369059 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368880 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 20:50:38.369059 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368887 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 
20:50:38.369059 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.368896 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 20:50:38.369703 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.369693 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 20:50:38.369703 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.369704 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 20:50:38.373051 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.373019 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 20:50:38.373097 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.373078 2576 server.go:1295] "Started kubelet" Apr 17 20:50:38.373338 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.373288 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 20:50:38.373424 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.373360 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 20:50:38.373805 ip-10-0-128-248 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 20:50:38.374596 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.374403 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 20:50:38.374596 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.374456 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 20:50:38.375416 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.375391 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2fbjg" Apr 17 20:50:38.375484 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.375416 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-248.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 20:50:38.375601 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.375560 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 20:50:38.375647 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.375597 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-248.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 20:50:38.377521 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.377503 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 17 20:50:38.380547 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.380532 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" 
Apr 17 20:50:38.380708 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.380692 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 20:50:38.380783 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.380723 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2fbjg" Apr 17 20:50:38.381076 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.381056 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 20:50:38.381143 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.381079 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 20:50:38.381198 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.381190 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 20:50:38.381284 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.381262 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 17 20:50:38.381284 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.381270 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 17 20:50:38.382146 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.381470 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found" Apr 17 20:50:38.382146 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.380468 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-248.ec2.internal.18a740155441f91e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-248.ec2.internal,UID:ip-10-0-128-248.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-248.ec2.internal,},FirstTimestamp:2026-04-17 20:50:38.373050654 +0000 UTC m=+0.379464687,LastTimestamp:2026-04-17 20:50:38.373050654 +0000 UTC m=+0.379464687,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-248.ec2.internal,}" Apr 17 20:50:38.384742 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.384723 2576 factory.go:55] Registering systemd factory Apr 17 20:50:38.384830 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.384784 2576 factory.go:223] Registration of the systemd container factory successfully Apr 17 20:50:38.385506 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.385489 2576 factory.go:153] Registering CRI-O factory Apr 17 20:50:38.385587 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.385510 2576 factory.go:223] Registration of the crio container factory successfully Apr 17 20:50:38.385587 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.385555 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 20:50:38.385587 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.385579 2576 factory.go:103] Registering Raw factory Apr 17 20:50:38.385690 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.385593 2576 manager.go:1196] Started watching for new ooms in manager Apr 17 20:50:38.386059 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.386046 2576 manager.go:319] Starting recovery of all containers Apr 17 20:50:38.386519 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.386479 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 20:50:38.395818 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.395806 2576 manager.go:324] Recovery completed Apr 17 20:50:38.398540 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.398518 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:38.399725 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.399713 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:38.400711 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.400696 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-248.ec2.internal\" not found" node="ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.401816 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.401802 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:38.401892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.401832 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:38.401892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.401847 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:38.402442 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.402425 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 20:50:38.402442 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.402439 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 20:50:38.402538 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.402454 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:50:38.405219 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.405207 2576 policy_none.go:49] "None policy: Start" Apr 17 20:50:38.405265 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.405223 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 20:50:38.405265 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.405232 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 17 20:50:38.444678 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.444664 2576 manager.go:341] "Starting Device Plugin manager" Apr 17 20:50:38.466248 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.444716 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 20:50:38.466248 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.444730 2576 server.go:85] "Starting device plugin registration server" Apr 17 20:50:38.466248 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.444940 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 20:50:38.466248 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.444951 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 20:50:38.466248 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.445057 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 20:50:38.466248 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.445129 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 20:50:38.466248 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.445141 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 20:50:38.466248 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.445845 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 20:50:38.466248 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.445886 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-248.ec2.internal\" not found" Apr 17 20:50:38.481072 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.481040 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 20:50:38.482306 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.482289 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 20:50:38.482413 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.482319 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 20:50:38.482413 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.482343 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 20:50:38.482413 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.482353 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 20:50:38.482543 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.482432 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 20:50:38.485189 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.485170 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:38.545845 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.545785 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:38.546673 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.546657 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:38.546756 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.546688 2576 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:38.546756 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.546700 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:38.546756 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.546722 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.552644 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.552617 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.552644 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.552641 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-248.ec2.internal\": node \"ip-10-0-128-248.ec2.internal\" not found" Apr 17 20:50:38.570206 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.570187 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found" Apr 17 20:50:38.582782 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.582761 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal"] Apr 17 20:50:38.582842 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.582815 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:38.583546 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.583532 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:38.583596 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.583560 2576 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:38.583596 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.583569 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:38.585896 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.585883 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:38.586050 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.586036 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.586090 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.586062 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:38.586494 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.586480 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:38.586585 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.586499 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:38.586585 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.586509 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:38.586585 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.586531 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:38.586585 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.586553 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 
20:50:38.586585 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.586567 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:38.588738 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.588724 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.588809 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.588748 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:38.590193 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.590169 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:38.590282 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.590201 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:38.590282 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.590216 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:38.615656 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.615638 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-248.ec2.internal\" not found" node="ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.619909 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.619893 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-248.ec2.internal\" not found" node="ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.670942 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.670923 2576 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"ip-10-0-128-248.ec2.internal\" not found" Apr 17 20:50:38.683461 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.683439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d579a59ca2c3bf4b3f744c41961ff1e1-config\") pod \"kube-apiserver-proxy-ip-10-0-128-248.ec2.internal\" (UID: \"d579a59ca2c3bf4b3f744c41961ff1e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.683544 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.683464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.683544 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.683482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.771154 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.771132 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found" Apr 17 20:50:38.784463 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.784445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d579a59ca2c3bf4b3f744c41961ff1e1-config\") pod \"kube-apiserver-proxy-ip-10-0-128-248.ec2.internal\" (UID: 
\"d579a59ca2c3bf4b3f744c41961ff1e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.784528 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.784502 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.784589 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.784533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.784589 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.784554 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d579a59ca2c3bf4b3f744c41961ff1e1-config\") pod \"kube-apiserver-proxy-ip-10-0-128-248.ec2.internal\" (UID: \"d579a59ca2c3bf4b3f744c41961ff1e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.784589 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.784563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 17 20:50:38.784683 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.784590 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal"
Apr 17 20:50:38.871899 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.871870 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found"
Apr 17 20:50:38.917341 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.917312 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal"
Apr 17 20:50:38.921585 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:38.921565 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal"
Apr 17 20:50:38.973000 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:38.972970 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found"
Apr 17 20:50:39.073391 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:39.073369 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found"
Apr 17 20:50:39.173874 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:39.173824 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found"
Apr 17 20:50:39.274306 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:39.274276 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found"
Apr 17 20:50:39.299723 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.299703 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 20:50:39.300173 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.299832 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 20:50:39.300173 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.299868 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 20:50:39.375253 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:39.375223 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found"
Apr 17 20:50:39.381376 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.381357 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 20:50:39.383161 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.383129 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 20:45:38 +0000 UTC" deadline="2028-01-25 11:14:56.703484542 +0000 UTC"
Apr 17 20:50:39.383161 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.383159 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15542h24m17.320327754s"
Apr 17 20:50:39.397994 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.397972 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 20:50:39.419746 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.419727 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-psl52"
Apr 17 20:50:39.428493 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.428453 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-psl52"
Apr 17 20:50:39.437915 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:39.437775 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod933ffdfca6e87b798592801ce6979396.slice/crio-e49c807afaa800a55bd74bc71c2aefdc514a284751faa73cc6e4eb7c8fe361c2 WatchSource:0}: Error finding container e49c807afaa800a55bd74bc71c2aefdc514a284751faa73cc6e4eb7c8fe361c2: Status 404 returned error can't find the container with id e49c807afaa800a55bd74bc71c2aefdc514a284751faa73cc6e4eb7c8fe361c2
Apr 17 20:50:39.438151 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:39.438134 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd579a59ca2c3bf4b3f744c41961ff1e1.slice/crio-91af27451204370390c6d9d57217b18c6648cdd51b0425669bfbb72a124cd4f3 WatchSource:0}: Error finding container 91af27451204370390c6d9d57217b18c6648cdd51b0425669bfbb72a124cd4f3: Status 404 returned error can't find the container with id 91af27451204370390c6d9d57217b18c6648cdd51b0425669bfbb72a124cd4f3
Apr 17 20:50:39.442679 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.442665 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 20:50:39.475994 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:39.475971 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found"
Apr 17 20:50:39.484646 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.484608 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" event={"ID":"933ffdfca6e87b798592801ce6979396","Type":"ContainerStarted","Data":"e49c807afaa800a55bd74bc71c2aefdc514a284751faa73cc6e4eb7c8fe361c2"}
Apr 17 20:50:39.485399 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.485380 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" event={"ID":"d579a59ca2c3bf4b3f744c41961ff1e1","Type":"ContainerStarted","Data":"91af27451204370390c6d9d57217b18c6648cdd51b0425669bfbb72a124cd4f3"}
Apr 17 20:50:39.521850 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.521833 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:50:39.576242 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:39.576220 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found"
Apr 17 20:50:39.676722 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:39.676698 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found"
Apr 17 20:50:39.746847 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.746790 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:50:39.781811 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.781792 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal"
Apr 17 20:50:39.792033 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.792004 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:50:39.792846 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.792834 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal"
Apr 17 20:50:39.801797 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:39.801780 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:50:40.363579 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.363550 2576 apiserver.go:52] "Watching apiserver"
Apr 17 20:50:40.371871 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.371841 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 20:50:40.372248 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.372222 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-g2jbd","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76","openshift-dns/node-resolver-bv5tl","openshift-image-registry/node-ca-q6mlz","openshift-multus/multus-additional-cni-plugins-hzs8v","openshift-multus/multus-ckt8c","openshift-network-diagnostics/network-check-target-hzbw9","openshift-ovn-kubernetes/ovnkube-node-pwcjq","kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal","openshift-cluster-node-tuning-operator/tuned-w7g97","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal","openshift-multus/network-metrics-daemon-z2lfh","openshift-network-operator/iptables-alerter-zsnm9"]
Apr 17 20:50:40.375158 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.375136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:40.375266 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:40.375230 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:50:40.377243 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.377225 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.379461 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.379427 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bv5tl"
Apr 17 20:50:40.379695 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.379668 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 20:50:40.379810 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.379797 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 20:50:40.379877 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.379827 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7qbl7\""
Apr 17 20:50:40.379932 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.379918 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 20:50:40.381694 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.381673 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q6mlz"
Apr 17 20:50:40.381861 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.381842 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cf6db\""
Apr 17 20:50:40.382480 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.382464 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 20:50:40.382574 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.382531 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 20:50:40.383902 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.383873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.383902 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.383899 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 20:50:40.384062 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.383932 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 20:50:40.384151 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.384136 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 20:50:40.384199 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.384136 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-scxbk\""
Apr 17 20:50:40.386172 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.385894 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.388666 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.386712 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 20:50:40.388804 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.388786 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 20:50:40.388870 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.388823 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ggbzf\""
Apr 17 20:50:40.389273 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.389246 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 20:50:40.389534 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.389369 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 20:50:40.389534 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.389385 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 20:50:40.389534 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.389412 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-w29pc\""
Apr 17 20:50:40.389534 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.389440 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 20:50:40.392394 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392367 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g2jbd"
Apr 17 20:50:40.392394 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z8zw\" (UniqueName: \"kubernetes.io/projected/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-kube-api-access-2z8zw\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.392539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-registration-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.392539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-device-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.392539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-sys-fs\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.392539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bfe609fc-0a8b-47f5-ac0d-e7e0c47917de-hosts-file\") pod \"node-resolver-bv5tl\" (UID: \"bfe609fc-0a8b-47f5-ac0d-e7e0c47917de\") " pod="openshift-dns/node-resolver-bv5tl"
Apr 17 20:50:40.392539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392530 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.392802 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bd8e668b-8da4-40de-9724-891b41810e2d-serviceca\") pod \"node-ca-q6mlz\" (UID: \"bd8e668b-8da4-40de-9724-891b41810e2d\") " pod="openshift-image-registry/node-ca-q6mlz"
Apr 17 20:50:40.392802 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.392802 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-socket-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.392994 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgxcv\" (UniqueName: \"kubernetes.io/projected/bd8e668b-8da4-40de-9724-891b41810e2d-kube-api-access-sgxcv\") pod \"node-ca-q6mlz\" (UID: \"bd8e668b-8da4-40de-9724-891b41810e2d\") " pod="openshift-image-registry/node-ca-q6mlz"
Apr 17 20:50:40.392994 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xptqq\" (UniqueName: \"kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq\") pod \"network-check-target-hzbw9\" (UID: \"276cca8f-28b9-4e8c-918c-32c78b8870e7\") " pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:40.392994 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-etc-selinux\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.392994 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfe609fc-0a8b-47f5-ac0d-e7e0c47917de-tmp-dir\") pod \"node-resolver-bv5tl\" (UID: \"bfe609fc-0a8b-47f5-ac0d-e7e0c47917de\") " pod="openshift-dns/node-resolver-bv5tl"
Apr 17 20:50:40.392994 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.392979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz484\" (UniqueName: \"kubernetes.io/projected/bfe609fc-0a8b-47f5-ac0d-e7e0c47917de-kube-api-access-sz484\") pod \"node-resolver-bv5tl\" (UID: \"bfe609fc-0a8b-47f5-ac0d-e7e0c47917de\") " pod="openshift-dns/node-resolver-bv5tl"
Apr 17 20:50:40.393243 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.393003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd8e668b-8da4-40de-9724-891b41810e2d-host\") pod \"node-ca-q6mlz\" (UID: \"bd8e668b-8da4-40de-9724-891b41810e2d\") " pod="openshift-image-registry/node-ca-q6mlz"
Apr 17 20:50:40.394743 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.394725 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.395107 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.395072 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 20:50:40.395533 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.395519 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 20:50:40.396623 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.396602 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 20:50:40.396715 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.396622 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hkpd2\""
Apr 17 20:50:40.396715 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.396628 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bxtgt\""
Apr 17 20:50:40.396715 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.396628 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 20:50:40.396906 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.396891 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 20:50:40.397076 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.397062 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 20:50:40.397168 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.397095 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bv6n4\""
Apr 17 20:50:40.397168 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.397119 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:40.397273 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:40.397185 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:50:40.397273 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.397209 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 20:50:40.397273 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.397216 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 20:50:40.397920 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.397902 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:50:40.398001 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.397904 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 20:50:40.399346 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.399326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zsnm9"
Apr 17 20:50:40.401579 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.401563 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:50:40.401858 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.401839 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 20:50:40.401972 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.401874 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 20:50:40.401972 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.401926 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2gpzj\""
Apr 17 20:50:40.430056 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.430008 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:45:39 +0000 UTC" deadline="2027-10-03 21:33:38.404120458 +0000 UTC"
Apr 17 20:50:40.430056 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.430050 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12816h42m57.9740745s"
Apr 17 20:50:40.482473 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.482452 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 20:50:40.494224 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-os-release\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.494314 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.494314 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-sys\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.494314 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-socket-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.494314 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgxcv\" (UniqueName: \"kubernetes.io/projected/bd8e668b-8da4-40de-9724-891b41810e2d-kube-api-access-sgxcv\") pod \"node-ca-q6mlz\" (UID: \"bd8e668b-8da4-40de-9724-891b41810e2d\") " pod="openshift-image-registry/node-ca-q6mlz"
Apr 17 20:50:40.494475 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494338 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-cnibin\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.494475 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm2r4\" (UniqueName: \"kubernetes.io/projected/f35870d4-a814-46cd-8764-30e79f3a7d79-kube-api-access-sm2r4\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:40.494475 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494417 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-etc-selinux\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.494475 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-run-multus-certs\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.494653 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-socket-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.494653 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-etc-selinux\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.494653 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9dtr\" (UniqueName: \"kubernetes.io/projected/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-kube-api-access-v9dtr\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.494653 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-node-log\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.494653 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-cni-bin\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.494653 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494633 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-sysctl-conf\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.494653 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494647 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-cni-dir\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.494892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494664 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-cnibin\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.494892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-cni-netd\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.494892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5da4be1-c2a5-411b-a362-4b558f88d556-ovnkube-config\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.494892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.494892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.494892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-kubernetes\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.494892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.494892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-run-netns\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494911 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-systemd-units\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfe609fc-0a8b-47f5-ac0d-e7e0c47917de-tmp-dir\") pod \"node-resolver-bv5tl\" (UID: \"bfe609fc-0a8b-47f5-ac0d-e7e0c47917de\") " pod="openshift-dns/node-resolver-bv5tl"
Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sz484\" (UniqueName: \"kubernetes.io/projected/bfe609fc-0a8b-47f5-ac0d-e7e0c47917de-kube-api-access-sz484\") pod \"node-resolver-bv5tl\" (UID: \"bfe609fc-0a8b-47f5-ac0d-e7e0c47917de\") " pod="openshift-dns/node-resolver-bv5tl"
Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd8e668b-8da4-40de-9724-891b41810e2d-host\") pod \"node-ca-q6mlz\" (UID: \"bd8e668b-8da4-40de-9724-891b41810e2d\") " pod="openshift-image-registry/node-ca-q6mlz"
Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495051 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c584a0a-c907-42fd-b3d5-c4705787e26a-iptables-alerter-script\") pod \"iptables-alerter-zsnm9\" (UID: \"0c584a0a-c907-42fd-b3d5-c4705787e26a\") " pod="openshift-network-operator/iptables-alerter-zsnm9"
Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495080
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-socket-dir-parent\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.494988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-conf-dir\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd8e668b-8da4-40de-9724-891b41810e2d-host\") pod \"node-ca-q6mlz\" (UID: \"bd8e668b-8da4-40de-9724-891b41810e2d\") " pod="openshift-image-registry/node-ca-q6mlz" Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/216f383a-4c2f-482f-b8e6-c36230d9ffbf-konnectivity-ca\") pod \"konnectivity-agent-g2jbd\" (UID: \"216f383a-4c2f-482f-b8e6-c36230d9ffbf\") " pod="kube-system/konnectivity-agent-g2jbd" Apr 17 20:50:40.495252 ip-10-0-128-248 
kubenswrapper[2576]: I0417 20:50:40.495182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-run-systemd\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5da4be1-c2a5-411b-a362-4b558f88d556-ovnkube-script-lib\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.495252 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfe609fc-0a8b-47f5-ac0d-e7e0c47917de-tmp-dir\") pod \"node-resolver-bv5tl\" (UID: \"bfe609fc-0a8b-47f5-ac0d-e7e0c47917de\") " pod="openshift-dns/node-resolver-bv5tl" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495257 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-modprobe-d\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-systemd\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ca6cd71-981d-410b-b6f0-e306196de4b3-tmp\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bfe609fc-0a8b-47f5-ac0d-e7e0c47917de-hosts-file\") pod \"node-resolver-bv5tl\" (UID: \"bfe609fc-0a8b-47f5-ac0d-e7e0c47917de\") " pod="openshift-dns/node-resolver-bv5tl" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bd8e668b-8da4-40de-9724-891b41810e2d-serviceca\") pod \"node-ca-q6mlz\" (UID: \"bd8e668b-8da4-40de-9724-891b41810e2d\") " pod="openshift-image-registry/node-ca-q6mlz" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-system-cni-dir\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-etc-kubernetes\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " 
pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bfe609fc-0a8b-47f5-ac0d-e7e0c47917de-hosts-file\") pod \"node-resolver-bv5tl\" (UID: \"bfe609fc-0a8b-47f5-ac0d-e7e0c47917de\") " pod="openshift-dns/node-resolver-bv5tl" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-log-socket\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5da4be1-c2a5-411b-a362-4b558f88d556-ovn-node-metrics-cert\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c584a0a-c907-42fd-b3d5-c4705787e26a-host-slash\") pod \"iptables-alerter-zsnm9\" (UID: \"0c584a0a-c907-42fd-b3d5-c4705787e26a\") " pod="openshift-network-operator/iptables-alerter-zsnm9" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495559 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-hostroot\") pod \"multus-ckt8c\" (UID: 
\"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-kubelet\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-run-netns\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495635 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v" Apr 17 20:50:40.495667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495659 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsgxc\" (UniqueName: \"kubernetes.io/projected/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-kube-api-access-jsgxc\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495679 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6jgdt\" (UniqueName: \"kubernetes.io/projected/4ca6cd71-981d-410b-b6f0-e306196de4b3-kube-api-access-6jgdt\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495734 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xptqq\" (UniqueName: \"kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq\") pod \"network-check-target-hzbw9\" (UID: \"276cca8f-28b9-4e8c-918c-32c78b8870e7\") " pod="openshift-network-diagnostics/network-check-target-hzbw9" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-run-k8s-cni-cncf-io\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bd8e668b-8da4-40de-9724-891b41810e2d-serviceca\") pod \"node-ca-q6mlz\" (UID: \"bd8e668b-8da4-40de-9724-891b41810e2d\") " pod="openshift-image-registry/node-ca-q6mlz" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-var-lib-cni-bin\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:50:40.495864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-slash\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-run-ovn\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z8zw\" (UniqueName: \"kubernetes.io/projected/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-kube-api-access-2z8zw\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f59cq\" (UniqueName: \"kubernetes.io/projected/0c584a0a-c907-42fd-b3d5-c4705787e26a-kube-api-access-f59cq\") pod \"iptables-alerter-zsnm9\" (UID: \"0c584a0a-c907-42fd-b3d5-c4705787e26a\") " pod="openshift-network-operator/iptables-alerter-zsnm9" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-var-lib-cni-multus\") pod \"multus-ckt8c\" (UID: 
\"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.495989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-daemon-config\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496018 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-etc-openvswitch\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5da4be1-c2a5-411b-a362-4b558f88d556-env-overrides\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.496335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-registration-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-device-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-registration-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496220 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlzk2\" (UniqueName: \"kubernetes.io/projected/c5da4be1-c2a5-411b-a362-4b558f88d556-kube-api-access-vlzk2\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.496918 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-device-dir\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-sysconfig\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-lib-modules\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-tuned\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-var-lib-openvswitch\") pod \"ovnkube-node-pwcjq\" (UID: 
\"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-os-release\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-run-openvswitch\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-sysctl-d\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-run\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-var-lib-kubelet\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-host\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-cni-binary-copy\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.496918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-sys-fs\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" Apr 17 20:50:40.497493 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496678 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-var-lib-kubelet\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.497493 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496733 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-sys-fs\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" Apr 17 20:50:40.497493 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/216f383a-4c2f-482f-b8e6-c36230d9ffbf-agent-certs\") pod \"konnectivity-agent-g2jbd\" (UID: \"216f383a-4c2f-482f-b8e6-c36230d9ffbf\") " pod="kube-system/konnectivity-agent-g2jbd" Apr 17 20:50:40.497493 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.496813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-system-cni-dir\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v" Apr 17 20:50:40.501995 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.501966 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:40.503925 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:40.503905 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Apr 17 20:50:40.504042 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:40.503930 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:50:40.504042 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:40.503944 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xptqq for pod openshift-network-diagnostics/network-check-target-hzbw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:40.504042 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:40.504014 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq podName:276cca8f-28b9-4e8c-918c-32c78b8870e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:41.00398535 +0000 UTC m=+3.010399366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xptqq" (UniqueName: "kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq") pod "network-check-target-hzbw9" (UID: "276cca8f-28b9-4e8c-918c-32c78b8870e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:40.504203 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.504079 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 20:50:40.507240 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.507222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz484\" (UniqueName: \"kubernetes.io/projected/bfe609fc-0a8b-47f5-ac0d-e7e0c47917de-kube-api-access-sz484\") pod \"node-resolver-bv5tl\" (UID: \"bfe609fc-0a8b-47f5-ac0d-e7e0c47917de\") " pod="openshift-dns/node-resolver-bv5tl"
Apr 17 20:50:40.507352 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.507300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgxcv\" (UniqueName: \"kubernetes.io/projected/bd8e668b-8da4-40de-9724-891b41810e2d-kube-api-access-sgxcv\") pod \"node-ca-q6mlz\" (UID: \"bd8e668b-8da4-40de-9724-891b41810e2d\") " pod="openshift-image-registry/node-ca-q6mlz"
Apr 17 20:50:40.507426 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.507407 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z8zw\" (UniqueName: \"kubernetes.io/projected/872463a9-2d67-4fef-aa20-3db5ecfe6fd8-kube-api-access-2z8zw\") pod \"aws-ebs-csi-driver-node-ttw76\" (UID: \"872463a9-2d67-4fef-aa20-3db5ecfe6fd8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76"
Apr 17 20:50:40.598126 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-cnibin\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.598126 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sm2r4\" (UniqueName: \"kubernetes.io/projected/f35870d4-a814-46cd-8764-30e79f3a7d79-kube-api-access-sm2r4\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:40.598321 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-run-multus-certs\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.598321 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9dtr\" (UniqueName: \"kubernetes.io/projected/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-kube-api-access-v9dtr\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.598321 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-cnibin\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.598321 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-node-log\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.598321 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-run-multus-certs\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.598321 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-node-log\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-cni-bin\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-sysctl-conf\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-cni-dir\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-cnibin\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-cni-bin\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-cni-netd\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598465 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5da4be1-c2a5-411b-a362-4b558f88d556-ovnkube-config\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598508 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-cni-dir\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-cni-netd\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-cnibin\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.598602 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-kubernetes\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-run-netns\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-kubernetes\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-sysctl-conf\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-systemd-units\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-run-netns\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c584a0a-c907-42fd-b3d5-c4705787e26a-iptables-alerter-script\") pod \"iptables-alerter-zsnm9\" (UID: \"0c584a0a-c907-42fd-b3d5-c4705787e26a\") " pod="openshift-network-operator/iptables-alerter-zsnm9"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598692 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-systemd-units\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-socket-dir-parent\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598745 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-conf-dir\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/216f383a-4c2f-482f-b8e6-c36230d9ffbf-konnectivity-ca\") pod \"konnectivity-agent-g2jbd\" (UID: \"216f383a-4c2f-482f-b8e6-c36230d9ffbf\") " pod="kube-system/konnectivity-agent-g2jbd"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-run-systemd\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-conf-dir\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5da4be1-c2a5-411b-a362-4b558f88d556-ovnkube-script-lib\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-socket-dir-parent\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-modprobe-d\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-systemd\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.599209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ca6cd71-981d-410b-b6f0-e306196de4b3-tmp\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-system-cni-dir\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-etc-kubernetes\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.598979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-log-socket\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5da4be1-c2a5-411b-a362-4b558f88d556-ovn-node-metrics-cert\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c584a0a-c907-42fd-b3d5-c4705787e26a-host-slash\") pod \"iptables-alerter-zsnm9\" (UID: \"0c584a0a-c907-42fd-b3d5-c4705787e26a\") " pod="openshift-network-operator/iptables-alerter-zsnm9"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-hostroot\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-kubelet\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-etc-kubernetes\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-run-netns\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-run-netns\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsgxc\" (UniqueName: \"kubernetes.io/projected/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-kube-api-access-jsgxc\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c584a0a-c907-42fd-b3d5-c4705787e26a-host-slash\") pod \"iptables-alerter-zsnm9\" (UID: \"0c584a0a-c907-42fd-b3d5-c4705787e26a\") " pod="openshift-network-operator/iptables-alerter-zsnm9"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599228 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c584a0a-c907-42fd-b3d5-c4705787e26a-iptables-alerter-script\") pod \"iptables-alerter-zsnm9\" (UID: \"0c584a0a-c907-42fd-b3d5-c4705787e26a\") " pod="openshift-network-operator/iptables-alerter-zsnm9"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jgdt\" (UniqueName: \"kubernetes.io/projected/4ca6cd71-981d-410b-b6f0-e306196de4b3-kube-api-access-6jgdt\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.600003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-run-k8s-cni-cncf-io\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-var-lib-cni-bin\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-slash\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599337 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-modprobe-d\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-run-ovn\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5da4be1-c2a5-411b-a362-4b558f88d556-ovnkube-script-lib\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/216f383a-4c2f-482f-b8e6-c36230d9ffbf-konnectivity-ca\") pod \"konnectivity-agent-g2jbd\" (UID: \"216f383a-4c2f-482f-b8e6-c36230d9ffbf\") " pod="kube-system/konnectivity-agent-g2jbd"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-run-ovn\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-kubelet\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5da4be1-c2a5-411b-a362-4b558f88d556-ovnkube-config\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-run-k8s-cni-cncf-io\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-run-systemd\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f59cq\" (UniqueName: \"kubernetes.io/projected/0c584a0a-c907-42fd-b3d5-c4705787e26a-kube-api-access-f59cq\") pod \"iptables-alerter-zsnm9\" (UID: \"0c584a0a-c907-42fd-b3d5-c4705787e26a\") " pod="openshift-network-operator/iptables-alerter-zsnm9"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-var-lib-cni-bin\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-hostroot\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-var-lib-cni-multus\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-slash\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-daemon-config\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.600774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-var-lib-cni-multus\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-system-cni-dir\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-etc-openvswitch\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-etc-openvswitch\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-log-socket\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5da4be1-c2a5-411b-a362-4b558f88d556-env-overrides\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlzk2\" (UniqueName: \"kubernetes.io/projected/c5da4be1-c2a5-411b-a362-4b558f88d556-kube-api-access-vlzk2\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-sysconfig\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-lib-modules\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-tuned\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-var-lib-openvswitch\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-os-release\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.601588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-run-openvswitch\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-sysctl-d\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-run\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-var-lib-openvswitch\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v"
Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-var-lib-kubelet\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-os-release\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c"
Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:40.600104 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-run\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-sysctl-d\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97"
Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5da4be1-c2a5-411b-a362-4b558f88d556-run-openvswitch\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:40.600170 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs podName:f35870d4-a814-46cd-8764-30e79f3a7d79
nodeName:}" failed. No retries permitted until 2026-04-17 20:50:41.100153458 +0000 UTC m=+3.106567480 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs") pod "network-metrics-daemon-z2lfh" (UID: "f35870d4-a814-46cd-8764-30e79f3a7d79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-var-lib-kubelet\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-host\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-cni-binary-copy\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-lib-modules\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 
20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-var-lib-kubelet\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/216f383a-4c2f-482f-b8e6-c36230d9ffbf-agent-certs\") pod \"konnectivity-agent-g2jbd\" (UID: \"216f383a-4c2f-482f-b8e6-c36230d9ffbf\") " pod="kube-system/konnectivity-agent-g2jbd" Apr 17 20:50:40.602354 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-sysconfig\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-system-cni-dir\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-os-release\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " 
pod="openshift-multus/multus-additional-cni-plugins-hzs8v" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600328 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5da4be1-c2a5-411b-a362-4b558f88d556-env-overrides\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-multus-daemon-config\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-host-var-lib-kubelet\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-system-cni-dir\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: 
\"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-os-release\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-host\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-sys\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.600475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-sys\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.599530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-systemd\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.601067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-cni-binary-copy\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.601309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.602063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ca6cd71-981d-410b-b6f0-e306196de4b3-tmp\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.602129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5da4be1-c2a5-411b-a362-4b558f88d556-ovn-node-metrics-cert\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.602901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ca6cd71-981d-410b-b6f0-e306196de4b3-etc-tuned\") pod \"tuned-w7g97\" (UID: \"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.603182 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.602943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/216f383a-4c2f-482f-b8e6-c36230d9ffbf-agent-certs\") pod \"konnectivity-agent-g2jbd\" (UID: \"216f383a-4c2f-482f-b8e6-c36230d9ffbf\") " pod="kube-system/konnectivity-agent-g2jbd" Apr 17 20:50:40.608137 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.608093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9dtr\" (UniqueName: \"kubernetes.io/projected/16f9d82e-a351-4b70-9872-d4e4ed6dd1b2-kube-api-access-v9dtr\") pod \"multus-ckt8c\" (UID: \"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2\") " pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.608335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.608317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm2r4\" (UniqueName: \"kubernetes.io/projected/f35870d4-a814-46cd-8764-30e79f3a7d79-kube-api-access-sm2r4\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh" Apr 17 20:50:40.609612 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.609545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f59cq\" (UniqueName: \"kubernetes.io/projected/0c584a0a-c907-42fd-b3d5-c4705787e26a-kube-api-access-f59cq\") pod \"iptables-alerter-zsnm9\" (UID: \"0c584a0a-c907-42fd-b3d5-c4705787e26a\") " pod="openshift-network-operator/iptables-alerter-zsnm9" Apr 17 20:50:40.609725 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.609670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jgdt\" (UniqueName: \"kubernetes.io/projected/4ca6cd71-981d-410b-b6f0-e306196de4b3-kube-api-access-6jgdt\") pod \"tuned-w7g97\" (UID: 
\"4ca6cd71-981d-410b-b6f0-e306196de4b3\") " pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.610082 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.610057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsgxc\" (UniqueName: \"kubernetes.io/projected/f9a06ede-5ace-4172-9e77-3eb6558ce0b1-kube-api-access-jsgxc\") pod \"multus-additional-cni-plugins-hzs8v\" (UID: \"f9a06ede-5ace-4172-9e77-3eb6558ce0b1\") " pod="openshift-multus/multus-additional-cni-plugins-hzs8v" Apr 17 20:50:40.610181 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.610138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlzk2\" (UniqueName: \"kubernetes.io/projected/c5da4be1-c2a5-411b-a362-4b558f88d556-kube-api-access-vlzk2\") pod \"ovnkube-node-pwcjq\" (UID: \"c5da4be1-c2a5-411b-a362-4b558f88d556\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.690271 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.690167 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" Apr 17 20:50:40.696358 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.696337 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bv5tl" Apr 17 20:50:40.704960 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.704941 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q6mlz" Apr 17 20:50:40.710488 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.710468 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hzs8v" Apr 17 20:50:40.717124 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.717107 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-ckt8c" Apr 17 20:50:40.724676 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.724659 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g2jbd" Apr 17 20:50:40.731195 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.731174 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:50:40.738887 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.738867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-w7g97" Apr 17 20:50:40.743899 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.743882 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:40.744927 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:40.744913 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-zsnm9" Apr 17 20:50:41.095998 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:41.095962 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod216f383a_4c2f_482f_b8e6_c36230d9ffbf.slice/crio-3a0f7543d7a127b1611a7e95def7b3eab903189e7888af230dba0d98bc5e7c60 WatchSource:0}: Error finding container 3a0f7543d7a127b1611a7e95def7b3eab903189e7888af230dba0d98bc5e7c60: Status 404 returned error can't find the container with id 3a0f7543d7a127b1611a7e95def7b3eab903189e7888af230dba0d98bc5e7c60 Apr 17 20:50:41.097139 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:41.097119 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9a06ede_5ace_4172_9e77_3eb6558ce0b1.slice/crio-4b9be62fa817978bf55e02cf936f160a4d7d1436812498d059d01eb4d3865403 WatchSource:0}: Error finding container 4b9be62fa817978bf55e02cf936f160a4d7d1436812498d059d01eb4d3865403: Status 404 returned error can't find the container with id 4b9be62fa817978bf55e02cf936f160a4d7d1436812498d059d01eb4d3865403 Apr 17 20:50:41.098395 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:41.098374 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfe609fc_0a8b_47f5_ac0d_e7e0c47917de.slice/crio-b68e6ec35edb6615b1d055c51814cd84ab99b6d9e17cc7a4d7c2e30a24794cda WatchSource:0}: Error finding container b68e6ec35edb6615b1d055c51814cd84ab99b6d9e17cc7a4d7c2e30a24794cda: Status 404 returned error can't find the container with id b68e6ec35edb6615b1d055c51814cd84ab99b6d9e17cc7a4d7c2e30a24794cda Apr 17 20:50:41.099322 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:41.099305 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f9d82e_a351_4b70_9872_d4e4ed6dd1b2.slice/crio-78f90926a42c623e2e97d80df882bb32f5f29c46656d882a823ff663a6d64da4 WatchSource:0}: Error finding container 78f90926a42c623e2e97d80df882bb32f5f29c46656d882a823ff663a6d64da4: Status 404 returned error can't find the container with id 78f90926a42c623e2e97d80df882bb32f5f29c46656d882a823ff663a6d64da4 Apr 17 20:50:41.102158 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.102135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh" Apr 17 20:50:41.102229 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.102203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xptqq\" (UniqueName: \"kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq\") pod \"network-check-target-hzbw9\" (UID: \"276cca8f-28b9-4e8c-918c-32c78b8870e7\") " pod="openshift-network-diagnostics/network-check-target-hzbw9" Apr 17 20:50:41.102292 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:41.102275 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:41.102347 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:41.102296 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:50:41.102347 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:41.102310 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered 
Apr 17 20:50:41.102347 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:41.102322 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xptqq for pod openshift-network-diagnostics/network-check-target-hzbw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:41.102347 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:41.102328 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs podName:f35870d4-a814-46cd-8764-30e79f3a7d79 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:42.102309925 +0000 UTC m=+4.108723940 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs") pod "network-metrics-daemon-z2lfh" (UID: "f35870d4-a814-46cd-8764-30e79f3a7d79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:41.102347 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:41.102323 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c584a0a_c907_42fd_b3d5_c4705787e26a.slice/crio-a9057ddcd540e20024bd208360310a0fb31c2883c4c699a09326936d060f6c18 WatchSource:0}: Error finding container a9057ddcd540e20024bd208360310a0fb31c2883c4c699a09326936d060f6c18: Status 404 returned error can't find the container with id a9057ddcd540e20024bd208360310a0fb31c2883c4c699a09326936d060f6c18 Apr 17 20:50:41.102511 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:41.102356 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq podName:276cca8f-28b9-4e8c-918c-32c78b8870e7 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:50:42.102344481 +0000 UTC m=+4.108758492 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xptqq" (UniqueName: "kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq") pod "network-check-target-hzbw9" (UID: "276cca8f-28b9-4e8c-918c-32c78b8870e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:41.103315 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:41.103282 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5da4be1_c2a5_411b_a362_4b558f88d556.slice/crio-b91a2668442ec3d8fa0d9cbba4daa2c2a807257237cffbd9768e3b1caa3f45e9 WatchSource:0}: Error finding container b91a2668442ec3d8fa0d9cbba4daa2c2a807257237cffbd9768e3b1caa3f45e9: Status 404 returned error can't find the container with id b91a2668442ec3d8fa0d9cbba4daa2c2a807257237cffbd9768e3b1caa3f45e9 Apr 17 20:50:41.104804 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:41.104694 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod872463a9_2d67_4fef_aa20_3db5ecfe6fd8.slice/crio-a724ddc88fe0b3bf49929e74dba6359947cd65a45ec5af45da594e776b5cb20c WatchSource:0}: Error finding container a724ddc88fe0b3bf49929e74dba6359947cd65a45ec5af45da594e776b5cb20c: Status 404 returned error can't find the container with id a724ddc88fe0b3bf49929e74dba6359947cd65a45ec5af45da594e776b5cb20c Apr 17 20:50:41.105735 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:41.105435 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd8e668b_8da4_40de_9724_891b41810e2d.slice/crio-c9345ba58222df18138afe2837195385ac887583980c116a34ead5ca90ad7cbd WatchSource:0}: Error finding container 
c9345ba58222df18138afe2837195385ac887583980c116a34ead5ca90ad7cbd: Status 404 returned error can't find the container with id c9345ba58222df18138afe2837195385ac887583980c116a34ead5ca90ad7cbd Apr 17 20:50:41.106570 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:50:41.106479 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ca6cd71_981d_410b_b6f0_e306196de4b3.slice/crio-136f91c81d9a4123fd97b33600654a2b24b8577eb1a55b9e33800a4b0f2fa270 WatchSource:0}: Error finding container 136f91c81d9a4123fd97b33600654a2b24b8577eb1a55b9e33800a4b0f2fa270: Status 404 returned error can't find the container with id 136f91c81d9a4123fd97b33600654a2b24b8577eb1a55b9e33800a4b0f2fa270 Apr 17 20:50:41.431196 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.431120 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:45:39 +0000 UTC" deadline="2027-10-28 14:25:04.149206913 +0000 UTC" Apr 17 20:50:41.431196 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.431155 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13409h34m22.718055106s" Apr 17 20:50:41.489355 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.489315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zsnm9" event={"ID":"0c584a0a-c907-42fd-b3d5-c4705787e26a","Type":"ContainerStarted","Data":"a9057ddcd540e20024bd208360310a0fb31c2883c4c699a09326936d060f6c18"} Apr 17 20:50:41.491317 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.491266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g2jbd" event={"ID":"216f383a-4c2f-482f-b8e6-c36230d9ffbf","Type":"ContainerStarted","Data":"3a0f7543d7a127b1611a7e95def7b3eab903189e7888af230dba0d98bc5e7c60"} Apr 17 20:50:41.494823 ip-10-0-128-248 
kubenswrapper[2576]: I0417 20:50:41.494796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q6mlz" event={"ID":"bd8e668b-8da4-40de-9724-891b41810e2d","Type":"ContainerStarted","Data":"c9345ba58222df18138afe2837195385ac887583980c116a34ead5ca90ad7cbd"}
Apr 17 20:50:41.496554 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.496534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" event={"ID":"c5da4be1-c2a5-411b-a362-4b558f88d556","Type":"ContainerStarted","Data":"b91a2668442ec3d8fa0d9cbba4daa2c2a807257237cffbd9768e3b1caa3f45e9"}
Apr 17 20:50:41.501273 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.501247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ckt8c" event={"ID":"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2","Type":"ContainerStarted","Data":"78f90926a42c623e2e97d80df882bb32f5f29c46656d882a823ff663a6d64da4"}
Apr 17 20:50:41.503592 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.503571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bv5tl" event={"ID":"bfe609fc-0a8b-47f5-ac0d-e7e0c47917de","Type":"ContainerStarted","Data":"b68e6ec35edb6615b1d055c51814cd84ab99b6d9e17cc7a4d7c2e30a24794cda"}
Apr 17 20:50:41.506871 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.506848 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzs8v" event={"ID":"f9a06ede-5ace-4172-9e77-3eb6558ce0b1","Type":"ContainerStarted","Data":"4b9be62fa817978bf55e02cf936f160a4d7d1436812498d059d01eb4d3865403"}
Apr 17 20:50:41.511173 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.511148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" event={"ID":"d579a59ca2c3bf4b3f744c41961ff1e1","Type":"ContainerStarted","Data":"3724c43e7d28b5b7708053b79e9d6882c6595d9615e282589b814b341e0e6462"}
Apr 17 20:50:41.514346 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.514314 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-w7g97" event={"ID":"4ca6cd71-981d-410b-b6f0-e306196de4b3","Type":"ContainerStarted","Data":"136f91c81d9a4123fd97b33600654a2b24b8577eb1a55b9e33800a4b0f2fa270"}
Apr 17 20:50:41.516124 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:41.516073 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" event={"ID":"872463a9-2d67-4fef-aa20-3db5ecfe6fd8","Type":"ContainerStarted","Data":"a724ddc88fe0b3bf49929e74dba6359947cd65a45ec5af45da594e776b5cb20c"}
Apr 17 20:50:42.109631 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:42.109587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xptqq\" (UniqueName: \"kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq\") pod \"network-check-target-hzbw9\" (UID: \"276cca8f-28b9-4e8c-918c-32c78b8870e7\") " pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:42.109806 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:42.109653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:42.109806 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:42.109777 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:42.109915 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:42.109847 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs podName:f35870d4-a814-46cd-8764-30e79f3a7d79 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:44.109827418 +0000 UTC m=+6.116241431 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs") pod "network-metrics-daemon-z2lfh" (UID: "f35870d4-a814-46cd-8764-30e79f3a7d79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:42.109988 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:42.109959 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:50:42.109988 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:42.109974 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:50:42.109988 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:42.109986 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xptqq for pod openshift-network-diagnostics/network-check-target-hzbw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:42.110189 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:42.110037 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq podName:276cca8f-28b9-4e8c-918c-32c78b8870e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:44.110012403 +0000 UTC m=+6.116426416 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xptqq" (UniqueName: "kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq") pod "network-check-target-hzbw9" (UID: "276cca8f-28b9-4e8c-918c-32c78b8870e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:42.483486 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:42.483405 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:42.483924 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:42.483509 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:50:42.486097 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:42.486069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:42.486221 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:42.486177 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:50:42.526019 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:42.525986 2576 generic.go:358] "Generic (PLEG): container finished" podID="933ffdfca6e87b798592801ce6979396" containerID="f6db75ce6c956093e9f3323a3bd4cb95c895feb45bdf3b40d455974b604d97e1" exitCode=0
Apr 17 20:50:42.526903 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:42.526878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" event={"ID":"933ffdfca6e87b798592801ce6979396","Type":"ContainerDied","Data":"f6db75ce6c956093e9f3323a3bd4cb95c895feb45bdf3b40d455974b604d97e1"}
Apr 17 20:50:42.541144 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:42.541085 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" podStartSLOduration=3.541069221 podStartE2EDuration="3.541069221s" podCreationTimestamp="2026-04-17 20:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:50:41.528327513 +0000 UTC m=+3.534741547" watchObservedRunningTime="2026-04-17 20:50:42.541069221 +0000 UTC m=+4.547483257"
Apr 17 20:50:43.538146 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:43.538110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" event={"ID":"933ffdfca6e87b798592801ce6979396","Type":"ContainerStarted","Data":"ac6c4d4ffdd843d3302b0057a0419df69d003965f7518855fe015cdc3d95d06a"}
Apr 17 20:50:43.553432 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:43.553382 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" podStartSLOduration=4.553364676 podStartE2EDuration="4.553364676s" podCreationTimestamp="2026-04-17 20:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:50:43.552990266 +0000 UTC m=+5.559404301" watchObservedRunningTime="2026-04-17 20:50:43.553364676 +0000 UTC m=+5.559778704"
Apr 17 20:50:44.128747 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:44.128694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:44.128994 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:44.128776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xptqq\" (UniqueName: \"kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq\") pod \"network-check-target-hzbw9\" (UID: \"276cca8f-28b9-4e8c-918c-32c78b8870e7\") " pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:44.128994 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:44.128913 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:50:44.128994 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:44.128932 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:50:44.128994 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:44.128945 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xptqq for pod openshift-network-diagnostics/network-check-target-hzbw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:44.129234 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:44.129002 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq podName:276cca8f-28b9-4e8c-918c-32c78b8870e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:48.128982964 +0000 UTC m=+10.135396990 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xptqq" (UniqueName: "kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq") pod "network-check-target-hzbw9" (UID: "276cca8f-28b9-4e8c-918c-32c78b8870e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:44.129504 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:44.129416 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:44.129504 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:44.129476 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs podName:f35870d4-a814-46cd-8764-30e79f3a7d79 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:48.129460987 +0000 UTC m=+10.135875003 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs") pod "network-metrics-daemon-z2lfh" (UID: "f35870d4-a814-46cd-8764-30e79f3a7d79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:44.483510 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:44.483432 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:44.483510 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:44.483464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:44.483729 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:44.483573 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:50:44.483729 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:44.483684 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:50:46.483470 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:46.483419 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:46.483932 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:46.483557 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:50:46.483932 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:46.483723 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:46.483932 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:46.483870 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:50:48.163219 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:48.163176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xptqq\" (UniqueName: \"kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq\") pod \"network-check-target-hzbw9\" (UID: \"276cca8f-28b9-4e8c-918c-32c78b8870e7\") " pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:48.163709 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:48.163246 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:48.163709 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:48.163375 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:48.163709 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:48.163435 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs podName:f35870d4-a814-46cd-8764-30e79f3a7d79 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:56.163416944 +0000 UTC m=+18.169830966 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs") pod "network-metrics-daemon-z2lfh" (UID: "f35870d4-a814-46cd-8764-30e79f3a7d79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:48.163709 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:48.163452 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:50:48.163709 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:48.163468 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:50:48.163709 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:48.163479 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xptqq for pod openshift-network-diagnostics/network-check-target-hzbw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:48.163709 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:48.163515 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq podName:276cca8f-28b9-4e8c-918c-32c78b8870e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:56.163504532 +0000 UTC m=+18.169918555 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xptqq" (UniqueName: "kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq") pod "network-check-target-hzbw9" (UID: "276cca8f-28b9-4e8c-918c-32c78b8870e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:48.487056 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:48.486966 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:48.487056 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:48.486991 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:48.487263 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:48.487106 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:50:48.487263 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:48.487192 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:50:50.482792 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:50.482757 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:50.483253 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:50.482868 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:50:50.483253 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:50.482918 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:50.483253 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:50.483004 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:50:52.483561 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:52.483520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:52.484138 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:52.483520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:52.484138 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:52.483654 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:50:52.484138 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:52.483741 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:50:54.482801 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:54.482770 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:54.483226 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:54.482777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:54.483226 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:54.482893 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:50:54.483226 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:54.482988 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:50:56.223107 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.223071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:56.223442 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.223127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xptqq\" (UniqueName: \"kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq\") pod \"network-check-target-hzbw9\" (UID: \"276cca8f-28b9-4e8c-918c-32c78b8870e7\") " pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:56.223442 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:56.223213 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:56.223442 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:56.223276 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs podName:f35870d4-a814-46cd-8764-30e79f3a7d79 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:12.223261024 +0000 UTC m=+34.229675039 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs") pod "network-metrics-daemon-z2lfh" (UID: "f35870d4-a814-46cd-8764-30e79f3a7d79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:50:56.223442 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:56.223220 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:50:56.223442 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:56.223302 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:50:56.223442 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:56.223311 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xptqq for pod openshift-network-diagnostics/network-check-target-hzbw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:56.223442 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:56.223338 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq podName:276cca8f-28b9-4e8c-918c-32c78b8870e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:12.2233314 +0000 UTC m=+34.229745411 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xptqq" (UniqueName: "kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq") pod "network-check-target-hzbw9" (UID: "276cca8f-28b9-4e8c-918c-32c78b8870e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:56.420727 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.420693 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5ws25"]
Apr 17 20:50:56.508662 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.508593 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:56.508802 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:56.508671 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5ws25" podUID="1690cc02-1c37-4fea-8728-f08db138a40f"
Apr 17 20:50:56.510785 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.510764 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:56.510880 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.510770 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:56.510929 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:56.510885 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:50:56.510984 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:56.510926 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:50:56.626100 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.626061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1690cc02-1c37-4fea-8728-f08db138a40f-dbus\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:56.626269 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.626116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1690cc02-1c37-4fea-8728-f08db138a40f-kubelet-config\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:56.626269 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.626145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:56.726811 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.726776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1690cc02-1c37-4fea-8728-f08db138a40f-dbus\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:56.727001 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.726819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1690cc02-1c37-4fea-8728-f08db138a40f-kubelet-config\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:56.727001 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.726841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:56.727001 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.726917 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1690cc02-1c37-4fea-8728-f08db138a40f-kubelet-config\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:56.727001 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:56.726917 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1690cc02-1c37-4fea-8728-f08db138a40f-dbus\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:56.727001 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:56.726959 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:50:56.727274 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:56.727016 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret podName:1690cc02-1c37-4fea-8728-f08db138a40f nodeName:}" failed. No retries permitted until 2026-04-17 20:50:57.226998742 +0000 UTC m=+19.233412766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret") pod "global-pull-secret-syncer-5ws25" (UID: "1690cc02-1c37-4fea-8728-f08db138a40f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:50:57.231161 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:57.231123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:57.231585 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:57.231260 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:50:57.231585 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:57.231325 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret podName:1690cc02-1c37-4fea-8728-f08db138a40f nodeName:}" failed. No retries permitted until 2026-04-17 20:50:58.231306781 +0000 UTC m=+20.237720791 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret") pod "global-pull-secret-syncer-5ws25" (UID: "1690cc02-1c37-4fea-8728-f08db138a40f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:50:58.240979 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.240659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:58.240979 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:58.240813 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:50:58.240979 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:58.240865 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret podName:1690cc02-1c37-4fea-8728-f08db138a40f nodeName:}" failed. No retries permitted until 2026-04-17 20:51:00.24084655 +0000 UTC m=+22.247260565 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret") pod "global-pull-secret-syncer-5ws25" (UID: "1690cc02-1c37-4fea-8728-f08db138a40f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:50:58.483150 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.483123 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:50:58.483263 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:58.483210 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:50:58.483263 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.483252 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:50:58.483342 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:58.483305 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:50:58.483342 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.483327 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:50:58.483419 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:50:58.483369 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-5ws25" podUID="1690cc02-1c37-4fea-8728-f08db138a40f" Apr 17 20:50:58.564173 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.564144 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q6mlz" event={"ID":"bd8e668b-8da4-40de-9724-891b41810e2d","Type":"ContainerStarted","Data":"93082995a5b6102d61ad317988b95446093206a077682c1c19b91d60acc4562c"} Apr 17 20:50:58.565921 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.565878 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log" Apr 17 20:50:58.566234 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.566216 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5da4be1-c2a5-411b-a362-4b558f88d556" containerID="22391fbac96f3af380454350b459b0f5f0d71e63768f42b0d32e6a7b3f824d8e" exitCode=1 Apr 17 20:50:58.566317 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.566278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" event={"ID":"c5da4be1-c2a5-411b-a362-4b558f88d556","Type":"ContainerStarted","Data":"f717d35614eac85a20501a7b259d9d6fd15962d71a531e692cdfa1cd13155346"} Apr 17 20:50:58.566317 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.566302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" event={"ID":"c5da4be1-c2a5-411b-a362-4b558f88d556","Type":"ContainerDied","Data":"22391fbac96f3af380454350b459b0f5f0d71e63768f42b0d32e6a7b3f824d8e"} Apr 17 20:50:58.566317 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.566314 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" event={"ID":"c5da4be1-c2a5-411b-a362-4b558f88d556","Type":"ContainerStarted","Data":"3eacb26b4d02402bb6f9da807a0fa51344c3be4d3881eebaf48a1693c979f84c"} Apr 17 20:50:58.567506 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.567483 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ckt8c" event={"ID":"16f9d82e-a351-4b70-9872-d4e4ed6dd1b2","Type":"ContainerStarted","Data":"46dd2039a30983afec1a958dd653d23a1c0be31606a1637b1a031cd6171eb69e"} Apr 17 20:50:58.568761 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.568738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bv5tl" event={"ID":"bfe609fc-0a8b-47f5-ac0d-e7e0c47917de","Type":"ContainerStarted","Data":"8312126532f22a70d33aba8f5338ec11f2983341f3a77acc2b83b27acabee13d"} Apr 17 20:50:58.569948 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.569928 2576 generic.go:358] "Generic (PLEG): container finished" podID="f9a06ede-5ace-4172-9e77-3eb6558ce0b1" containerID="2e3f18f34ddee04cc50e9d148108467dd2b5575d6fcfdf7c992e0ac9782ab655" exitCode=0 Apr 17 20:50:58.570057 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.570008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzs8v" event={"ID":"f9a06ede-5ace-4172-9e77-3eb6558ce0b1","Type":"ContainerDied","Data":"2e3f18f34ddee04cc50e9d148108467dd2b5575d6fcfdf7c992e0ac9782ab655"} Apr 17 20:50:58.571267 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.571225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-w7g97" event={"ID":"4ca6cd71-981d-410b-b6f0-e306196de4b3","Type":"ContainerStarted","Data":"3f2e86a57e5a67e4123cb4539818d290b26f1ca7262040f4b47c2602d4ee419a"} Apr 17 20:50:58.572444 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.572416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" event={"ID":"872463a9-2d67-4fef-aa20-3db5ecfe6fd8","Type":"ContainerStarted","Data":"15e7c34128aa13b7fd9ed1954f9636f17f262eded2ca18ba944b0b82caf62b83"} Apr 17 20:50:58.573514 ip-10-0-128-248 
kubenswrapper[2576]: I0417 20:50:58.573495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g2jbd" event={"ID":"216f383a-4c2f-482f-b8e6-c36230d9ffbf","Type":"ContainerStarted","Data":"139b264597b8ca09ece3c7cc566e721a8e8c90f4ab83c02d02261554b7a91fd7"} Apr 17 20:50:58.579732 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.579699 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q6mlz" podStartSLOduration=3.746271319 podStartE2EDuration="20.579689349s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:50:41.107678077 +0000 UTC m=+3.114092088" lastFinishedPulling="2026-04-17 20:50:57.941096108 +0000 UTC m=+19.947510118" observedRunningTime="2026-04-17 20:50:58.579274438 +0000 UTC m=+20.585688471" watchObservedRunningTime="2026-04-17 20:50:58.579689349 +0000 UTC m=+20.586103379" Apr 17 20:50:58.632330 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.632298 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bv5tl" podStartSLOduration=3.773959745 podStartE2EDuration="20.632286778s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:50:41.100854014 +0000 UTC m=+3.107268025" lastFinishedPulling="2026-04-17 20:50:57.959181028 +0000 UTC m=+19.965595058" observedRunningTime="2026-04-17 20:50:58.632158135 +0000 UTC m=+20.638572168" watchObservedRunningTime="2026-04-17 20:50:58.632286778 +0000 UTC m=+20.638700810" Apr 17 20:50:58.632466 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.632447 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ckt8c" podStartSLOduration=3.7383233049999998 podStartE2EDuration="20.632442029s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:50:41.101604378 +0000 UTC m=+3.108018392" lastFinishedPulling="2026-04-17 
20:50:57.995723095 +0000 UTC m=+20.002137116" observedRunningTime="2026-04-17 20:50:58.617470321 +0000 UTC m=+20.623884354" watchObservedRunningTime="2026-04-17 20:50:58.632442029 +0000 UTC m=+20.638856061" Apr 17 20:50:58.646836 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:58.646804 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-g2jbd" podStartSLOduration=8.316544992 podStartE2EDuration="20.646792253s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:50:41.097668023 +0000 UTC m=+3.104082034" lastFinishedPulling="2026-04-17 20:50:53.427915269 +0000 UTC m=+15.434329295" observedRunningTime="2026-04-17 20:50:58.646641038 +0000 UTC m=+20.653055072" watchObservedRunningTime="2026-04-17 20:50:58.646792253 +0000 UTC m=+20.653206284" Apr 17 20:50:59.577969 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:59.577757 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log" Apr 17 20:50:59.578477 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:59.578398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" event={"ID":"c5da4be1-c2a5-411b-a362-4b558f88d556","Type":"ContainerStarted","Data":"7af0f9f19b820379fe06172139caa245eca472c291f4280dba3ad647313cbf94"} Apr 17 20:50:59.578599 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:59.578485 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" event={"ID":"c5da4be1-c2a5-411b-a362-4b558f88d556","Type":"ContainerStarted","Data":"ec7a86cec909ceb9cb535e7a2425d7825022cf0e40df5b6b8883901b0e58b140"} Apr 17 20:50:59.578599 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:59.578498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" 
event={"ID":"c5da4be1-c2a5-411b-a362-4b558f88d556","Type":"ContainerStarted","Data":"942b67334e4c652357584434c6f354e6098b042ec889bfe7f6717e8d54c2184d"} Apr 17 20:50:59.814816 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:50:59.814795 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 20:51:00.259325 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:00.259245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25" Apr 17 20:51:00.259511 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:00.259399 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:51:00.259511 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:00.259468 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret podName:1690cc02-1c37-4fea-8728-f08db138a40f nodeName:}" failed. No retries permitted until 2026-04-17 20:51:04.259453035 +0000 UTC m=+26.265867045 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret") pod "global-pull-secret-syncer-5ws25" (UID: "1690cc02-1c37-4fea-8728-f08db138a40f") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:51:00.458840 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:00.458716 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T20:50:59.814813708Z","UUID":"ddee2d19-57f7-48cc-ac18-ca18b6d8c8c4","Handler":null,"Name":"","Endpoint":""} Apr 17 20:51:00.460780 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:00.460761 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 20:51:00.460918 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:00.460788 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 20:51:00.483530 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:00.483502 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5ws25" Apr 17 20:51:00.483669 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:00.483537 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9" Apr 17 20:51:00.483669 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:00.483503 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh" Apr 17 20:51:00.483669 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:00.483634 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5ws25" podUID="1690cc02-1c37-4fea-8728-f08db138a40f" Apr 17 20:51:00.483828 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:00.483719 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79" Apr 17 20:51:00.483828 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:00.483802 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7" Apr 17 20:51:00.582851 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:00.582818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" event={"ID":"872463a9-2d67-4fef-aa20-3db5ecfe6fd8","Type":"ContainerStarted","Data":"2b745da7cf0f4f25b351d375aa2d33cc2479e32932619e1f7c9b431373add9cc"} Apr 17 20:51:00.584370 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:00.584333 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zsnm9" event={"ID":"0c584a0a-c907-42fd-b3d5-c4705787e26a","Type":"ContainerStarted","Data":"f7189913f214e755321aeaae6512e046de3155301616dd2a4f07230983117fce"} Apr 17 20:51:00.598641 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:00.598600 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-w7g97" podStartSLOduration=5.747778156 podStartE2EDuration="22.598588483s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:50:41.107981118 +0000 UTC m=+3.114395130" lastFinishedPulling="2026-04-17 20:50:57.958791434 +0000 UTC m=+19.965205457" observedRunningTime="2026-04-17 20:50:58.662123954 +0000 UTC m=+20.668537986" watchObservedRunningTime="2026-04-17 20:51:00.598588483 +0000 UTC m=+22.605002514" Apr 17 20:51:00.598771 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:00.598751 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zsnm9" podStartSLOduration=6.010602963 podStartE2EDuration="22.598747129s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:50:41.104679183 +0000 UTC m=+3.111093197" lastFinishedPulling="2026-04-17 20:50:57.69282334 +0000 UTC m=+19.699237363" observedRunningTime="2026-04-17 20:51:00.59818183 +0000 
UTC m=+22.604595862" watchObservedRunningTime="2026-04-17 20:51:00.598747129 +0000 UTC m=+22.605161159" Apr 17 20:51:01.588061 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:01.588007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" event={"ID":"872463a9-2d67-4fef-aa20-3db5ecfe6fd8","Type":"ContainerStarted","Data":"79519dd4de70d2183e8de103176d4ba563bea28bd58e37c6b1de8f81b6ea14ed"} Apr 17 20:51:01.591012 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:01.590988 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log" Apr 17 20:51:01.591396 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:01.591370 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" event={"ID":"c5da4be1-c2a5-411b-a362-4b558f88d556","Type":"ContainerStarted","Data":"9b56045dac82630833e5118a9bb8b7b25cbaa9ae96056f5e3825d6234d758f60"} Apr 17 20:51:01.606968 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:01.606933 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttw76" podStartSLOduration=3.798760734 podStartE2EDuration="23.606922272s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:50:41.107759919 +0000 UTC m=+3.114173933" lastFinishedPulling="2026-04-17 20:51:00.915921447 +0000 UTC m=+22.922335471" observedRunningTime="2026-04-17 20:51:01.606638558 +0000 UTC m=+23.613052624" watchObservedRunningTime="2026-04-17 20:51:01.606922272 +0000 UTC m=+23.613336302" Apr 17 20:51:01.652682 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:01.652659 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-g2jbd" Apr 17 20:51:02.483122 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:02.483090 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9" Apr 17 20:51:02.483293 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:02.483100 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5ws25" Apr 17 20:51:02.483293 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:02.483100 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh" Apr 17 20:51:02.483390 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:02.483300 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5ws25" podUID="1690cc02-1c37-4fea-8728-f08db138a40f" Apr 17 20:51:02.483390 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:02.483365 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79" Apr 17 20:51:02.483390 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:02.483192 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7" Apr 17 20:51:03.379072 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:03.378911 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-g2jbd" Apr 17 20:51:03.379520 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:03.379458 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-g2jbd" Apr 17 20:51:03.597435 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:03.597415 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log" Apr 17 20:51:03.597723 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:03.597696 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" event={"ID":"c5da4be1-c2a5-411b-a362-4b558f88d556","Type":"ContainerStarted","Data":"06a6ce7b7620600c5edbe7925b6968e2173a72d147e44308a102f017098879e0"} Apr 17 20:51:03.598166 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:03.598147 2576 scope.go:117] "RemoveContainer" containerID="22391fbac96f3af380454350b459b0f5f0d71e63768f42b0d32e6a7b3f824d8e" Apr 17 20:51:03.598468 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:03.598439 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:51:03.598468 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:03.598469 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:51:03.599451 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:03.599430 2576 generic.go:358] "Generic (PLEG): container finished" podID="f9a06ede-5ace-4172-9e77-3eb6558ce0b1" containerID="d6452ad44736f2685ddc26cda3924248dcefd24a47310bf04a3d8b65d5865561" exitCode=0 Apr 17 
20:51:03.599547 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:03.599472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzs8v" event={"ID":"f9a06ede-5ace-4172-9e77-3eb6558ce0b1","Type":"ContainerDied","Data":"d6452ad44736f2685ddc26cda3924248dcefd24a47310bf04a3d8b65d5865561"} Apr 17 20:51:03.600180 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:03.600163 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-g2jbd" Apr 17 20:51:03.615588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:03.615571 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:51:04.287728 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:04.287697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25" Apr 17 20:51:04.287890 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:04.287844 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:51:04.287955 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:04.287904 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret podName:1690cc02-1c37-4fea-8728-f08db138a40f nodeName:}" failed. No retries permitted until 2026-04-17 20:51:12.287890591 +0000 UTC m=+34.294304604 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret") pod "global-pull-secret-syncer-5ws25" (UID: "1690cc02-1c37-4fea-8728-f08db138a40f") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:51:04.483241 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:04.483220 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh" Apr 17 20:51:04.483535 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:04.483221 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9" Apr 17 20:51:04.483535 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:04.483310 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79" Apr 17 20:51:04.483535 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:04.483407 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7" Apr 17 20:51:04.483535 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:04.483232 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5ws25" Apr 17 20:51:04.483535 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:04.483504 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5ws25" podUID="1690cc02-1c37-4fea-8728-f08db138a40f" Apr 17 20:51:04.604705 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:04.604683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log" Apr 17 20:51:04.605012 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:04.604986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" event={"ID":"c5da4be1-c2a5-411b-a362-4b558f88d556","Type":"ContainerStarted","Data":"013f79e114e3335bc30395f8d2b7d728b5dcc3b223c2fcc9e7f9818ed839777e"} Apr 17 20:51:04.605210 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:04.605189 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" Apr 17 20:51:04.606993 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:04.606963 2576 generic.go:358] "Generic (PLEG): container finished" podID="f9a06ede-5ace-4172-9e77-3eb6558ce0b1" containerID="3f462fc17fb9c739e59c870e1b78dc001f4000a544c69802bc96a199a35cb6a3" exitCode=0 Apr 17 20:51:04.607089 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:04.607057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzs8v" event={"ID":"f9a06ede-5ace-4172-9e77-3eb6558ce0b1","Type":"ContainerDied","Data":"3f462fc17fb9c739e59c870e1b78dc001f4000a544c69802bc96a199a35cb6a3"} Apr 17 20:51:04.620019 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:04.619999 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:51:04.633532 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:04.633491 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq" podStartSLOduration=9.69026594 podStartE2EDuration="26.633480243s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:50:41.10559749 +0000 UTC m=+3.112011504" lastFinishedPulling="2026-04-17 20:50:58.048811785 +0000 UTC m=+20.055225807" observedRunningTime="2026-04-17 20:51:04.633199394 +0000 UTC m=+26.639613419" watchObservedRunningTime="2026-04-17 20:51:04.633480243 +0000 UTC m=+26.639894305"
Apr 17 20:51:05.115645 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:05.115376 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5ws25"]
Apr 17 20:51:05.115834 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:05.115768 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:51:05.115899 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:05.115852 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5ws25" podUID="1690cc02-1c37-4fea-8728-f08db138a40f"
Apr 17 20:51:05.118561 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:05.118535 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hzbw9"]
Apr 17 20:51:05.118673 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:05.118657 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:51:05.118777 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:05.118755 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:51:05.119326 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:05.119295 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z2lfh"]
Apr 17 20:51:05.119432 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:05.119414 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:51:05.119533 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:05.119506 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:51:05.610884 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:05.610857 2576 generic.go:358] "Generic (PLEG): container finished" podID="f9a06ede-5ace-4172-9e77-3eb6558ce0b1" containerID="514cfadac2f1b9518f8a2b8520786c76792626543f387caf4f32746ddb95a102" exitCode=0
Apr 17 20:51:05.611427 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:05.610954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzs8v" event={"ID":"f9a06ede-5ace-4172-9e77-3eb6558ce0b1","Type":"ContainerDied","Data":"514cfadac2f1b9518f8a2b8520786c76792626543f387caf4f32746ddb95a102"}
Apr 17 20:51:06.482641 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:06.482603 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:51:06.482802 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:06.482714 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:51:06.482802 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:06.482725 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:51:06.482927 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:06.482814 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5ws25" podUID="1690cc02-1c37-4fea-8728-f08db138a40f"
Apr 17 20:51:06.482927 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:06.482867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:51:06.483038 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:06.482941 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:51:08.483679 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:08.483648 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:51:08.484335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:08.483755 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:51:08.484335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:08.483791 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:51:08.484335 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:08.483790 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:51:08.484335 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:08.483857 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5ws25" podUID="1690cc02-1c37-4fea-8728-f08db138a40f"
Apr 17 20:51:08.484335 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:08.483935 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:51:10.483562 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.483529 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5ws25"
Apr 17 20:51:10.484004 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.483529 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:51:10.484004 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:10.483651 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5ws25" podUID="1690cc02-1c37-4fea-8728-f08db138a40f"
Apr 17 20:51:10.484004 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:10.483746 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzbw9" podUID="276cca8f-28b9-4e8c-918c-32c78b8870e7"
Apr 17 20:51:10.484004 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.483529 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:51:10.484004 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:10.483877 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:51:10.852849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.852823 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeReady"
Apr 17 20:51:10.853009 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.852967 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 20:51:10.888283 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.888256 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6db95569ff-m8thh"]
Apr 17 20:51:10.892889 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.892871 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:10.895353 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.895327 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 20:51:10.895449 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.895356 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-j6pkz\""
Apr 17 20:51:10.895449 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.895381 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 20:51:10.895522 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.895492 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 20:51:10.898261 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.898233 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wsdk8"]
Apr 17 20:51:10.902662 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.902110 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 20:51:10.905869 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.905708 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ws9kw"]
Apr 17 20:51:10.905986 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.905887 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:51:10.908890 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.908763 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jvfcm\""
Apr 17 20:51:10.908977 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.908919 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 20:51:10.908977 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.908958 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 20:51:10.909211 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.909192 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 20:51:10.909818 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.909504 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6db95569ff-m8thh"]
Apr 17 20:51:10.909818 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.909529 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wsdk8"]
Apr 17 20:51:10.909818 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.909621 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:10.912303 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.912288 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 20:51:10.912482 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.912461 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ws9kw"]
Apr 17 20:51:10.912482 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.912480 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 20:51:10.912632 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:10.912617 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tswpv\""
Apr 17 20:51:11.040429 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.040579 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-bound-sa-token\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.040579 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-certificates\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.040579 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gdsf\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-kube-api-access-8gdsf\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.040579 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040559 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2a7b59a-da53-4886-af79-da4e37b76a3e-tmp-dir\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.040786 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-trusted-ca\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.040786 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlpxj\" (UniqueName: \"kubernetes.io/projected/d2a7b59a-da53-4886-af79-da4e37b76a3e-kube-api-access-qlpxj\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.040786 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-image-registry-private-configuration\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.040786 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040722 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2a7b59a-da53-4886-af79-da4e37b76a3e-config-volume\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.040786 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sppf8\" (UniqueName: \"kubernetes.io/projected/be99f611-577c-4ca5-8103-64206cf5c3bd-kube-api-access-sppf8\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:51:11.040786 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad71e77d-454a-4f81-b1ef-644d66518fd4-ca-trust-extracted\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.041003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.041003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:51:11.041003 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.040883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-installation-pull-secrets\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.141578 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.141679 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-bound-sa-token\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.141679 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-certificates\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.141679 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gdsf\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-kube-api-access-8gdsf\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.141679 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2a7b59a-da53-4886-af79-da4e37b76a3e-tmp-dir\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.141824 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-trusted-ca\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.141824 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.141702 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:51:11.141824 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlpxj\" (UniqueName: \"kubernetes.io/projected/d2a7b59a-da53-4886-af79-da4e37b76a3e-kube-api-access-qlpxj\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.141824 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.141771 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls podName:d2a7b59a-da53-4886-af79-da4e37b76a3e nodeName:}" failed. No retries permitted until 2026-04-17 20:51:11.641750456 +0000 UTC m=+33.648164476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls") pod "dns-default-ws9kw" (UID: "d2a7b59a-da53-4886-af79-da4e37b76a3e") : secret "dns-default-metrics-tls" not found
Apr 17 20:51:11.141824 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-image-registry-private-configuration\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.142078 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2a7b59a-da53-4886-af79-da4e37b76a3e-config-volume\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.142078 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sppf8\" (UniqueName: \"kubernetes.io/projected/be99f611-577c-4ca5-8103-64206cf5c3bd-kube-api-access-sppf8\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:51:11.142078 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad71e77d-454a-4f81-b1ef-644d66518fd4-ca-trust-extracted\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.142078 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.142078 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.141994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:51:11.142078 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.142000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2a7b59a-da53-4886-af79-da4e37b76a3e-tmp-dir\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.142078 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.142019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-installation-pull-secrets\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.142433 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.142275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-certificates\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.142433 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.142364 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:51:11.142433 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.142367 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad71e77d-454a-4f81-b1ef-644d66518fd4-ca-trust-extracted\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.142433 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.142397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2a7b59a-da53-4886-af79-da4e37b76a3e-config-volume\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.142433 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.142371 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:51:11.142433 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.142424 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert podName:be99f611-577c-4ca5-8103-64206cf5c3bd nodeName:}" failed. No retries permitted until 2026-04-17 20:51:11.642400974 +0000 UTC m=+33.648814992 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert") pod "ingress-canary-wsdk8" (UID: "be99f611-577c-4ca5-8103-64206cf5c3bd") : secret "canary-serving-cert" not found
Apr 17 20:51:11.142433 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.142430 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6db95569ff-m8thh: secret "image-registry-tls" not found
Apr 17 20:51:11.142724 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.142467 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls podName:ad71e77d-454a-4f81-b1ef-644d66518fd4 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:11.642452017 +0000 UTC m=+33.648866027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls") pod "image-registry-6db95569ff-m8thh" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4") : secret "image-registry-tls" not found
Apr 17 20:51:11.145690 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.145674 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-image-registry-private-configuration\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.145756 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.145678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-installation-pull-secrets\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.150585 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.150553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-bound-sa-token\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.150685 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.150668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sppf8\" (UniqueName: \"kubernetes.io/projected/be99f611-577c-4ca5-8103-64206cf5c3bd-kube-api-access-sppf8\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:51:11.151454 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.151435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlpxj\" (UniqueName: \"kubernetes.io/projected/d2a7b59a-da53-4886-af79-da4e37b76a3e-kube-api-access-qlpxj\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.151686 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.151671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gdsf\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-kube-api-access-8gdsf\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.159789 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.159771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-trusted-ca\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.624760 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.624731 2576 generic.go:358] "Generic (PLEG): container finished" podID="f9a06ede-5ace-4172-9e77-3eb6558ce0b1" containerID="12b4705ed124bc35a0b6c0eb037021ba6457a07052577b9d94039135e615f38b" exitCode=0
Apr 17 20:51:11.625110 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.624804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzs8v" event={"ID":"f9a06ede-5ace-4172-9e77-3eb6558ce0b1","Type":"ContainerDied","Data":"12b4705ed124bc35a0b6c0eb037021ba6457a07052577b9d94039135e615f38b"}
Apr 17 20:51:11.645419 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.645395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:11.645543 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.645434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:51:11.645604 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.645541 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:51:11.645604 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.645545 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:51:11.645604 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.645567 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6db95569ff-m8thh: secret "image-registry-tls" not found
Apr 17 20:51:11.645604 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:11.645582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:11.645742 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.645612 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls podName:ad71e77d-454a-4f81-b1ef-644d66518fd4 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:12.645592019 +0000 UTC m=+34.652006046 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls") pod "image-registry-6db95569ff-m8thh" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4") : secret "image-registry-tls" not found
Apr 17 20:51:11.645742 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.645652 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:51:11.645742 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.645653 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert podName:be99f611-577c-4ca5-8103-64206cf5c3bd nodeName:}" failed. No retries permitted until 2026-04-17 20:51:12.645635173 +0000 UTC m=+34.652049183 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert") pod "ingress-canary-wsdk8" (UID: "be99f611-577c-4ca5-8103-64206cf5c3bd") : secret "canary-serving-cert" not found
Apr 17 20:51:11.645742 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:11.645697 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls podName:d2a7b59a-da53-4886-af79-da4e37b76a3e nodeName:}" failed. No retries permitted until 2026-04-17 20:51:12.645686841 +0000 UTC m=+34.652100851 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls") pod "dns-default-ws9kw" (UID: "d2a7b59a-da53-4886-af79-da4e37b76a3e") : secret "dns-default-metrics-tls" not found
Apr 17 20:51:12.249975 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.249936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xptqq\" (UniqueName: \"kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq\") pod \"network-check-target-hzbw9\" (UID: \"276cca8f-28b9-4e8c-918c-32c78b8870e7\") " pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:51:12.250207 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.249994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:51:12.250207 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.250111 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:51:12.250207 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.250131 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:51:12.250207 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.250136 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:51:12.250207 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.250141 2576 projected.go:194] Error preparing data for projected volume
kube-api-access-xptqq for pod openshift-network-diagnostics/network-check-target-hzbw9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:51:12.250207 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.250193 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs podName:f35870d4-a814-46cd-8764-30e79f3a7d79 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:44.250177721 +0000 UTC m=+66.256591731 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs") pod "network-metrics-daemon-z2lfh" (UID: "f35870d4-a814-46cd-8764-30e79f3a7d79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:51:12.250424 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.250231 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq podName:276cca8f-28b9-4e8c-918c-32c78b8870e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:44.250215581 +0000 UTC m=+66.256629594 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xptqq" (UniqueName: "kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq") pod "network-check-target-hzbw9" (UID: "276cca8f-28b9-4e8c-918c-32c78b8870e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:51:12.350750 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.350685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25" Apr 17 20:51:12.350859 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.350830 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:51:12.350895 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.350881 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret podName:1690cc02-1c37-4fea-8728-f08db138a40f nodeName:}" failed. No retries permitted until 2026-04-17 20:51:28.350864793 +0000 UTC m=+50.357278820 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret") pod "global-pull-secret-syncer-5ws25" (UID: "1690cc02-1c37-4fea-8728-f08db138a40f") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:51:12.483305 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.483283 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh" Apr 17 20:51:12.483417 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.483283 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9" Apr 17 20:51:12.483519 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.483283 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5ws25" Apr 17 20:51:12.485893 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.485873 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 20:51:12.486001 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.485871 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:51:12.487011 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.486995 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jrdfc\"" Apr 17 20:51:12.487117 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.487012 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:51:12.487181 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.487120 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:51:12.487277 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.487257 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f22jz\"" Apr 17 20:51:12.629138 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.629081 2576 generic.go:358] "Generic (PLEG): container finished" podID="f9a06ede-5ace-4172-9e77-3eb6558ce0b1" 
containerID="6ce101e3888085be91b38479343fff4d4a3db3f138f01914d3bfe9b5cae7d5b5" exitCode=0 Apr 17 20:51:12.629138 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.629127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzs8v" event={"ID":"f9a06ede-5ace-4172-9e77-3eb6558ce0b1","Type":"ContainerDied","Data":"6ce101e3888085be91b38479343fff4d4a3db3f138f01914d3bfe9b5cae7d5b5"} Apr 17 20:51:12.653980 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.653948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh" Apr 17 20:51:12.654082 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.654007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8" Apr 17 20:51:12.654132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:12.654083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw" Apr 17 20:51:12.654132 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.654115 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:51:12.654209 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.654134 2576 projected.go:194] Error preparing data for projected volume registry-tls for 
pod openshift-image-registry/image-registry-6db95569ff-m8thh: secret "image-registry-tls" not found Apr 17 20:51:12.654209 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.654174 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:12.654209 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.654184 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls podName:ad71e77d-454a-4f81-b1ef-644d66518fd4 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:14.654166422 +0000 UTC m=+36.660580441 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls") pod "image-registry-6db95569ff-m8thh" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4") : secret "image-registry-tls" not found Apr 17 20:51:12.654346 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.654219 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls podName:d2a7b59a-da53-4886-af79-da4e37b76a3e nodeName:}" failed. No retries permitted until 2026-04-17 20:51:14.654203334 +0000 UTC m=+36.660617357 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls") pod "dns-default-ws9kw" (UID: "d2a7b59a-da53-4886-af79-da4e37b76a3e") : secret "dns-default-metrics-tls" not found Apr 17 20:51:12.654346 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.654261 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:12.654346 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:12.654312 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert podName:be99f611-577c-4ca5-8103-64206cf5c3bd nodeName:}" failed. No retries permitted until 2026-04-17 20:51:14.654300082 +0000 UTC m=+36.660714097 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert") pod "ingress-canary-wsdk8" (UID: "be99f611-577c-4ca5-8103-64206cf5c3bd") : secret "canary-serving-cert" not found Apr 17 20:51:13.633501 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:13.633469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzs8v" event={"ID":"f9a06ede-5ace-4172-9e77-3eb6558ce0b1","Type":"ContainerStarted","Data":"20353b07701320cc3cfca1e1d77db168a187b2202dbcc6aa5b033808e3653af0"} Apr 17 20:51:13.655986 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:13.655948 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hzs8v" podStartSLOduration=5.686804446 podStartE2EDuration="35.655935256s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:50:41.09952314 +0000 UTC m=+3.105937154" lastFinishedPulling="2026-04-17 20:51:11.068653949 +0000 UTC m=+33.075067964" observedRunningTime="2026-04-17 20:51:13.65433916 +0000 UTC 
m=+35.660753191" watchObservedRunningTime="2026-04-17 20:51:13.655935256 +0000 UTC m=+35.662349279" Apr 17 20:51:14.667036 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:14.666997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh" Apr 17 20:51:14.667334 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:14.667049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8" Apr 17 20:51:14.667334 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:14.667122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw" Apr 17 20:51:14.667334 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:14.667154 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:14.667334 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:14.667247 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert podName:be99f611-577c-4ca5-8103-64206cf5c3bd nodeName:}" failed. No retries permitted until 2026-04-17 20:51:18.667229457 +0000 UTC m=+40.673643470 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert") pod "ingress-canary-wsdk8" (UID: "be99f611-577c-4ca5-8103-64206cf5c3bd") : secret "canary-serving-cert" not found Apr 17 20:51:14.667334 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:14.667253 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:51:14.667334 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:14.667269 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6db95569ff-m8thh: secret "image-registry-tls" not found Apr 17 20:51:14.667334 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:14.667272 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:14.667334 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:14.667321 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls podName:d2a7b59a-da53-4886-af79-da4e37b76a3e nodeName:}" failed. No retries permitted until 2026-04-17 20:51:18.667310657 +0000 UTC m=+40.673724679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls") pod "dns-default-ws9kw" (UID: "d2a7b59a-da53-4886-af79-da4e37b76a3e") : secret "dns-default-metrics-tls" not found Apr 17 20:51:14.667334 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:14.667337 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls podName:ad71e77d-454a-4f81-b1ef-644d66518fd4 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:18.66732912 +0000 UTC m=+40.673743145 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls") pod "image-registry-6db95569ff-m8thh" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4") : secret "image-registry-tls" not found Apr 17 20:51:18.696436 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:18.696405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh" Apr 17 20:51:18.696436 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:18.696436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8" Apr 17 20:51:18.696798 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:18.696461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw" Apr 17 20:51:18.696798 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:18.696542 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:18.696798 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:18.696549 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:18.696798 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:18.696590 2576 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls podName:d2a7b59a-da53-4886-af79-da4e37b76a3e nodeName:}" failed. No retries permitted until 2026-04-17 20:51:26.696576624 +0000 UTC m=+48.702990634 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls") pod "dns-default-ws9kw" (UID: "d2a7b59a-da53-4886-af79-da4e37b76a3e") : secret "dns-default-metrics-tls" not found Apr 17 20:51:18.696798 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:18.696612 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert podName:be99f611-577c-4ca5-8103-64206cf5c3bd nodeName:}" failed. No retries permitted until 2026-04-17 20:51:26.696600129 +0000 UTC m=+48.703014139 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert") pod "ingress-canary-wsdk8" (UID: "be99f611-577c-4ca5-8103-64206cf5c3bd") : secret "canary-serving-cert" not found Apr 17 20:51:18.696798 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:18.696551 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:51:18.696798 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:18.696625 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6db95569ff-m8thh: secret "image-registry-tls" not found Apr 17 20:51:18.696798 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:18.696654 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls podName:ad71e77d-454a-4f81-b1ef-644d66518fd4 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:51:26.696644554 +0000 UTC m=+48.703058609 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls") pod "image-registry-6db95569ff-m8thh" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4") : secret "image-registry-tls" not found Apr 17 20:51:26.756870 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:26.756832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh" Apr 17 20:51:26.756870 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:26.756869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8" Apr 17 20:51:26.757307 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:26.756896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw" Apr 17 20:51:26.757307 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:26.756981 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:26.757307 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:26.757052 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert 
podName:be99f611-577c-4ca5-8103-64206cf5c3bd nodeName:}" failed. No retries permitted until 2026-04-17 20:51:42.757037471 +0000 UTC m=+64.763451495 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert") pod "ingress-canary-wsdk8" (UID: "be99f611-577c-4ca5-8103-64206cf5c3bd") : secret "canary-serving-cert" not found Apr 17 20:51:26.757307 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:26.756988 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:26.757307 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:26.757096 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls podName:d2a7b59a-da53-4886-af79-da4e37b76a3e nodeName:}" failed. No retries permitted until 2026-04-17 20:51:42.757085219 +0000 UTC m=+64.763499229 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls") pod "dns-default-ws9kw" (UID: "d2a7b59a-da53-4886-af79-da4e37b76a3e") : secret "dns-default-metrics-tls" not found Apr 17 20:51:26.757307 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:26.756986 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:51:26.757307 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:26.757109 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6db95569ff-m8thh: secret "image-registry-tls" not found Apr 17 20:51:26.757307 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:26.757128 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls podName:ad71e77d-454a-4f81-b1ef-644d66518fd4 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:42.7571225 +0000 UTC m=+64.763536510 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls") pod "image-registry-6db95569ff-m8thh" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4") : secret "image-registry-tls" not found Apr 17 20:51:28.366513 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:28.366477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25" Apr 17 20:51:28.369205 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:28.369184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1690cc02-1c37-4fea-8728-f08db138a40f-original-pull-secret\") pod \"global-pull-secret-syncer-5ws25\" (UID: \"1690cc02-1c37-4fea-8728-f08db138a40f\") " pod="kube-system/global-pull-secret-syncer-5ws25" Apr 17 20:51:28.403519 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:28.403489 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5ws25" Apr 17 20:51:28.573785 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:28.573757 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5ws25"] Apr 17 20:51:28.577290 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:51:28.577259 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1690cc02_1c37_4fea_8728_f08db138a40f.slice/crio-f30068968d851c16b6a20e77dbe80ecdecb17421fafbf1ff5dc72f8b798592d4 WatchSource:0}: Error finding container f30068968d851c16b6a20e77dbe80ecdecb17421fafbf1ff5dc72f8b798592d4: Status 404 returned error can't find the container with id f30068968d851c16b6a20e77dbe80ecdecb17421fafbf1ff5dc72f8b798592d4 Apr 17 20:51:28.659785 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:28.659708 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5ws25" event={"ID":"1690cc02-1c37-4fea-8728-f08db138a40f","Type":"ContainerStarted","Data":"f30068968d851c16b6a20e77dbe80ecdecb17421fafbf1ff5dc72f8b798592d4"} Apr 17 20:51:32.667259 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:32.667178 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5ws25" event={"ID":"1690cc02-1c37-4fea-8728-f08db138a40f","Type":"ContainerStarted","Data":"79a2759f5eb78b79fb02682a9012e29d1add78f57aeab7f409c1786c188f8c0e"} Apr 17 20:51:32.681653 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:32.681611 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5ws25" podStartSLOduration=32.912755895 podStartE2EDuration="36.681598008s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:51:28.578819744 +0000 UTC m=+50.585233754" lastFinishedPulling="2026-04-17 20:51:32.34766185 +0000 UTC m=+54.354075867" 
observedRunningTime="2026-04-17 20:51:32.681301787 +0000 UTC m=+54.687715819" watchObservedRunningTime="2026-04-17 20:51:32.681598008 +0000 UTC m=+54.688012037"
Apr 17 20:51:36.622282 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:36.622255 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwcjq"
Apr 17 20:51:42.774528 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:42.774488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:51:42.774528 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:42.774526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:51:42.775072 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:42.774547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:51:42.775072 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:42.774625 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:51:42.775072 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:42.774643 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6db95569ff-m8thh: secret "image-registry-tls" not found
Apr 17 20:51:42.775072 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:42.774648 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:51:42.775072 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:42.774671 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:51:42.775072 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:42.774693 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls podName:d2a7b59a-da53-4886-af79-da4e37b76a3e nodeName:}" failed. No retries permitted until 2026-04-17 20:52:14.774680716 +0000 UTC m=+96.781094726 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls") pod "dns-default-ws9kw" (UID: "d2a7b59a-da53-4886-af79-da4e37b76a3e") : secret "dns-default-metrics-tls" not found
Apr 17 20:51:42.775072 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:42.774706 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls podName:ad71e77d-454a-4f81-b1ef-644d66518fd4 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:14.774700447 +0000 UTC m=+96.781114457 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls") pod "image-registry-6db95569ff-m8thh" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4") : secret "image-registry-tls" not found
Apr 17 20:51:42.775072 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:42.774736 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert podName:be99f611-577c-4ca5-8103-64206cf5c3bd nodeName:}" failed. No retries permitted until 2026-04-17 20:52:14.774719117 +0000 UTC m=+96.781133143 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert") pod "ingress-canary-wsdk8" (UID: "be99f611-577c-4ca5-8103-64206cf5c3bd") : secret "canary-serving-cert" not found
Apr 17 20:51:44.286445 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:44.286414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xptqq\" (UniqueName: \"kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq\") pod \"network-check-target-hzbw9\" (UID: \"276cca8f-28b9-4e8c-918c-32c78b8870e7\") " pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:51:44.287144 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:44.286454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:51:44.289409 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:44.289389 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 20:51:44.289502 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:44.289408 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 20:51:44.297668 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:44.297650 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 20:51:44.297741 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:51:44.297712 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs podName:f35870d4-a814-46cd-8764-30e79f3a7d79 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:48.29769216 +0000 UTC m=+130.304106170 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs") pod "network-metrics-daemon-z2lfh" (UID: "f35870d4-a814-46cd-8764-30e79f3a7d79") : secret "metrics-daemon-secret" not found
Apr 17 20:51:44.299694 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:44.299678 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 20:51:44.310235 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:44.310213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xptqq\" (UniqueName: \"kubernetes.io/projected/276cca8f-28b9-4e8c-918c-32c78b8870e7-kube-api-access-xptqq\") pod \"network-check-target-hzbw9\" (UID: \"276cca8f-28b9-4e8c-918c-32c78b8870e7\") " pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:51:44.601038 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:44.601004 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jrdfc\""
Apr 17 20:51:44.609365 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:44.609348 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:51:44.731695 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:44.731640 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hzbw9"]
Apr 17 20:51:44.734460 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:51:44.734435 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod276cca8f_28b9_4e8c_918c_32c78b8870e7.slice/crio-d0e1aae41426ec9faa382426aeaa80bda308383eed3afe3baf9820bfa3f9716a WatchSource:0}: Error finding container d0e1aae41426ec9faa382426aeaa80bda308383eed3afe3baf9820bfa3f9716a: Status 404 returned error can't find the container with id d0e1aae41426ec9faa382426aeaa80bda308383eed3afe3baf9820bfa3f9716a
Apr 17 20:51:45.702480 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:45.702447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hzbw9" event={"ID":"276cca8f-28b9-4e8c-918c-32c78b8870e7","Type":"ContainerStarted","Data":"d0e1aae41426ec9faa382426aeaa80bda308383eed3afe3baf9820bfa3f9716a"}
Apr 17 20:51:47.708308 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:47.708234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hzbw9" event={"ID":"276cca8f-28b9-4e8c-918c-32c78b8870e7","Type":"ContainerStarted","Data":"88aaedd7a94c8bd9a407c16f0e2d7dd6c743e21c6c542f1828996edf86f69701"}
Apr 17 20:51:47.708615 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:47.708344 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:51:47.725341 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:51:47.725302 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hzbw9" podStartSLOduration=67.083893056 podStartE2EDuration="1m9.725289982s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:51:44.736141845 +0000 UTC m=+66.742555856" lastFinishedPulling="2026-04-17 20:51:47.377538768 +0000 UTC m=+69.383952782" observedRunningTime="2026-04-17 20:51:47.724683737 +0000 UTC m=+69.731097780" watchObservedRunningTime="2026-04-17 20:51:47.725289982 +0000 UTC m=+69.731704019"
Apr 17 20:52:14.805205 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:14.805177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls\") pod \"image-registry-6db95569ff-m8thh\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") " pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:52:14.805205 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:14.805210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:52:14.805652 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:14.805232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:52:14.805652 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:52:14.805320 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:52:14.805652 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:52:14.805331 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:52:14.805652 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:52:14.805386 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls podName:d2a7b59a-da53-4886-af79-da4e37b76a3e nodeName:}" failed. No retries permitted until 2026-04-17 20:53:18.805372362 +0000 UTC m=+160.811786372 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls") pod "dns-default-ws9kw" (UID: "d2a7b59a-da53-4886-af79-da4e37b76a3e") : secret "dns-default-metrics-tls" not found
Apr 17 20:52:14.805652 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:52:14.805398 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert podName:be99f611-577c-4ca5-8103-64206cf5c3bd nodeName:}" failed. No retries permitted until 2026-04-17 20:53:18.805391449 +0000 UTC m=+160.811805459 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert") pod "ingress-canary-wsdk8" (UID: "be99f611-577c-4ca5-8103-64206cf5c3bd") : secret "canary-serving-cert" not found
Apr 17 20:52:14.805652 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:52:14.805332 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:52:14.805652 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:52:14.805408 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6db95569ff-m8thh: secret "image-registry-tls" not found
Apr 17 20:52:14.805652 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:52:14.805438 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls podName:ad71e77d-454a-4f81-b1ef-644d66518fd4 nodeName:}" failed. No retries permitted until 2026-04-17 20:53:18.805432141 +0000 UTC m=+160.811846151 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls") pod "image-registry-6db95569ff-m8thh" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4") : secret "image-registry-tls" not found
Apr 17 20:52:18.712179 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:18.712149 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hzbw9"
Apr 17 20:52:45.476041 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.475994 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wf7nr"]
Apr 17 20:52:45.479335 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.479317 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"]
Apr 17 20:52:45.479485 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.479467 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.481953 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.481931 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 20:52:45.482154 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.482134 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-hfhrb\""
Apr 17 20:52:45.482576 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.482555 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 20:52:45.482678 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.482585 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 20:52:45.482678 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.482641 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"
Apr 17 20:52:45.483206 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.483188 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 20:52:45.485464 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.485441 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 20:52:45.485566 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.485462 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:52:45.485633 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.485608 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-s2x5v\""
Apr 17 20:52:45.485633 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.485626 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 20:52:45.485760 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.485644 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 20:52:45.488685 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.488667 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 20:52:45.493149 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.493130 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"]
Apr 17 20:52:45.493795 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.493766 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wf7nr"]
Apr 17 20:52:45.587996 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.587965 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"]
Apr 17 20:52:45.590998 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.590979 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"
Apr 17 20:52:45.593442 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.593420 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:52:45.593549 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.593422 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 17 20:52:45.593549 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.593427 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 17 20:52:45.593549 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.593511 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 17 20:52:45.593706 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.593585 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-m8l2d\""
Apr 17 20:52:45.602690 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.602672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e1e5242-44ff-4464-9b66-32c92d278373-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gzcjp\" (UID: \"7e1e5242-44ff-4464-9b66-32c92d278373\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"
Apr 17 20:52:45.602771 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.602696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1e5242-44ff-4464-9b66-32c92d278373-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gzcjp\" (UID: \"7e1e5242-44ff-4464-9b66-32c92d278373\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"
Apr 17 20:52:45.602771 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.602742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ddc2491-03fd-4683-9633-ee15d1c98c2d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.602840 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.602785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x5jv\" (UniqueName: \"kubernetes.io/projected/7e1e5242-44ff-4464-9b66-32c92d278373-kube-api-access-4x5jv\") pod \"kube-storage-version-migrator-operator-6769c5d45-gzcjp\" (UID: \"7e1e5242-44ff-4464-9b66-32c92d278373\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"
Apr 17 20:52:45.602840 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.602822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6ddc2491-03fd-4683-9633-ee15d1c98c2d-snapshots\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.602902 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.602840 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ddc2491-03fd-4683-9633-ee15d1c98c2d-serving-cert\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.602902 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.602866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v46k\" (UniqueName: \"kubernetes.io/projected/6ddc2491-03fd-4683-9633-ee15d1c98c2d-kube-api-access-8v46k\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.602902 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.602899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ddc2491-03fd-4683-9633-ee15d1c98c2d-tmp\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.602994 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.602915 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ddc2491-03fd-4683-9633-ee15d1c98c2d-service-ca-bundle\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.608886 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.608865 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"]
Apr 17 20:52:45.703973 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.703935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e1e5242-44ff-4464-9b66-32c92d278373-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gzcjp\" (UID: \"7e1e5242-44ff-4464-9b66-32c92d278373\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"
Apr 17 20:52:45.703973 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.703975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1e5242-44ff-4464-9b66-32c92d278373-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gzcjp\" (UID: \"7e1e5242-44ff-4464-9b66-32c92d278373\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"
Apr 17 20:52:45.704219 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ddc2491-03fd-4683-9633-ee15d1c98c2d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.704219 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4x5jv\" (UniqueName: \"kubernetes.io/projected/7e1e5242-44ff-4464-9b66-32c92d278373-kube-api-access-4x5jv\") pod \"kube-storage-version-migrator-operator-6769c5d45-gzcjp\" (UID: \"7e1e5242-44ff-4464-9b66-32c92d278373\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"
Apr 17 20:52:45.704219 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704126 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6ddc2491-03fd-4683-9633-ee15d1c98c2d-snapshots\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.704219 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ddc2491-03fd-4683-9633-ee15d1c98c2d-serving-cert\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.704219 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v46k\" (UniqueName: \"kubernetes.io/projected/6ddc2491-03fd-4683-9633-ee15d1c98c2d-kube-api-access-8v46k\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.704452 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ddc2491-03fd-4683-9633-ee15d1c98c2d-tmp\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.704452 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ddc2491-03fd-4683-9633-ee15d1c98c2d-service-ca-bundle\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.704452 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7403c3a-b10b-4584-9045-252ae239c2ef-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ntz2w\" (UID: \"f7403c3a-b10b-4584-9045-252ae239c2ef\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"
Apr 17 20:52:45.704452 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704313 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7403c3a-b10b-4584-9045-252ae239c2ef-config\") pod \"service-ca-operator-d6fc45fc5-ntz2w\" (UID: \"f7403c3a-b10b-4584-9045-252ae239c2ef\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"
Apr 17 20:52:45.704452 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdcr7\" (UniqueName: \"kubernetes.io/projected/f7403c3a-b10b-4584-9045-252ae239c2ef-kube-api-access-jdcr7\") pod \"service-ca-operator-d6fc45fc5-ntz2w\" (UID: \"f7403c3a-b10b-4584-9045-252ae239c2ef\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"
Apr 17 20:52:45.704835 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ddc2491-03fd-4683-9633-ee15d1c98c2d-tmp\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.704922 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.704817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6ddc2491-03fd-4683-9633-ee15d1c98c2d-snapshots\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.705104 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.705087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1e5242-44ff-4464-9b66-32c92d278373-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gzcjp\" (UID: \"7e1e5242-44ff-4464-9b66-32c92d278373\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"
Apr 17 20:52:45.705379 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.705355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ddc2491-03fd-4683-9633-ee15d1c98c2d-service-ca-bundle\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.705445 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.705379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ddc2491-03fd-4683-9633-ee15d1c98c2d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.706404 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.706381 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e1e5242-44ff-4464-9b66-32c92d278373-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gzcjp\" (UID: \"7e1e5242-44ff-4464-9b66-32c92d278373\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"
Apr 17 20:52:45.706540 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.706526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ddc2491-03fd-4683-9633-ee15d1c98c2d-serving-cert\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.726379 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.726316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v46k\" (UniqueName: \"kubernetes.io/projected/6ddc2491-03fd-4683-9633-ee15d1c98c2d-kube-api-access-8v46k\") pod \"insights-operator-585dfdc468-wf7nr\" (UID: \"6ddc2491-03fd-4683-9633-ee15d1c98c2d\") " pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.726379 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.726356 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x5jv\" (UniqueName: \"kubernetes.io/projected/7e1e5242-44ff-4464-9b66-32c92d278373-kube-api-access-4x5jv\") pod \"kube-storage-version-migrator-operator-6769c5d45-gzcjp\" (UID: \"7e1e5242-44ff-4464-9b66-32c92d278373\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"
Apr 17 20:52:45.790199 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.790172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-wf7nr"
Apr 17 20:52:45.795017 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.794991 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"
Apr 17 20:52:45.804988 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.804965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7403c3a-b10b-4584-9045-252ae239c2ef-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ntz2w\" (UID: \"f7403c3a-b10b-4584-9045-252ae239c2ef\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"
Apr 17 20:52:45.805086 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.804997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7403c3a-b10b-4584-9045-252ae239c2ef-config\") pod \"service-ca-operator-d6fc45fc5-ntz2w\" (UID: \"f7403c3a-b10b-4584-9045-252ae239c2ef\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"
Apr 17 20:52:45.805086 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.805017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdcr7\" (UniqueName: \"kubernetes.io/projected/f7403c3a-b10b-4584-9045-252ae239c2ef-kube-api-access-jdcr7\") pod \"service-ca-operator-d6fc45fc5-ntz2w\" (UID: \"f7403c3a-b10b-4584-9045-252ae239c2ef\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"
Apr 17 20:52:45.805593 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.805568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7403c3a-b10b-4584-9045-252ae239c2ef-config\") pod \"service-ca-operator-d6fc45fc5-ntz2w\" (UID: \"f7403c3a-b10b-4584-9045-252ae239c2ef\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"
Apr 17 20:52:45.807517 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.807495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7403c3a-b10b-4584-9045-252ae239c2ef-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ntz2w\" (UID: \"f7403c3a-b10b-4584-9045-252ae239c2ef\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"
Apr 17 20:52:45.813782 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.813756 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdcr7\" (UniqueName: \"kubernetes.io/projected/f7403c3a-b10b-4584-9045-252ae239c2ef-kube-api-access-jdcr7\") pod \"service-ca-operator-d6fc45fc5-ntz2w\" (UID: \"f7403c3a-b10b-4584-9045-252ae239c2ef\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"
Apr 17 20:52:45.899337 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.899311 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"
Apr 17 20:52:45.905143 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.905116 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wf7nr"]
Apr 17 20:52:45.908313 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:52:45.908285 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ddc2491_03fd_4683_9633_ee15d1c98c2d.slice/crio-e6feb99b0c3b38422a098c1f4ecc0313b621fa5377cb776279d045a1d9c410fe WatchSource:0}: Error finding container e6feb99b0c3b38422a098c1f4ecc0313b621fa5377cb776279d045a1d9c410fe: Status 404 returned error can't find the container with id e6feb99b0c3b38422a098c1f4ecc0313b621fa5377cb776279d045a1d9c410fe
Apr 17 20:52:45.918453 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:45.918427 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp"]
Apr 17 20:52:45.920525 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:52:45.920500 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e1e5242_44ff_4464_9b66_32c92d278373.slice/crio-33dd14f3857998a70705280aca3beb1711d3a93d308d71fd4e00140cf3b892ff WatchSource:0}: Error finding container 33dd14f3857998a70705280aca3beb1711d3a93d308d71fd4e00140cf3b892ff: Status 404 returned error can't find the container with id 33dd14f3857998a70705280aca3beb1711d3a93d308d71fd4e00140cf3b892ff
Apr 17 20:52:46.005298 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:46.005220 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w"]
Apr 17 20:52:46.008267 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:52:46.008245 2576 manager.go:1169] Failed to process watch
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7403c3a_b10b_4584_9045_252ae239c2ef.slice/crio-117c94b31b64c4fb7ffb570df079bcf0594c93eda99dbee10f6022083dffa134 WatchSource:0}: Error finding container 117c94b31b64c4fb7ffb570df079bcf0594c93eda99dbee10f6022083dffa134: Status 404 returned error can't find the container with id 117c94b31b64c4fb7ffb570df079bcf0594c93eda99dbee10f6022083dffa134 Apr 17 20:52:46.813545 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:46.813505 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w" event={"ID":"f7403c3a-b10b-4584-9045-252ae239c2ef","Type":"ContainerStarted","Data":"117c94b31b64c4fb7ffb570df079bcf0594c93eda99dbee10f6022083dffa134"} Apr 17 20:52:46.814698 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:46.814659 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp" event={"ID":"7e1e5242-44ff-4464-9b66-32c92d278373","Type":"ContainerStarted","Data":"33dd14f3857998a70705280aca3beb1711d3a93d308d71fd4e00140cf3b892ff"} Apr 17 20:52:46.815868 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:46.815843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wf7nr" event={"ID":"6ddc2491-03fd-4683-9633-ee15d1c98c2d","Type":"ContainerStarted","Data":"e6feb99b0c3b38422a098c1f4ecc0313b621fa5377cb776279d045a1d9c410fe"} Apr 17 20:52:48.325759 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:48.325726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh" Apr 17 20:52:48.326165 ip-10-0-128-248 
kubenswrapper[2576]: E0417 20:52:48.325903 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:52:48.326165 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:52:48.325992 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs podName:f35870d4-a814-46cd-8764-30e79f3a7d79 nodeName:}" failed. No retries permitted until 2026-04-17 20:54:50.325964112 +0000 UTC m=+252.332378128 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs") pod "network-metrics-daemon-z2lfh" (UID: "f35870d4-a814-46cd-8764-30e79f3a7d79") : secret "metrics-daemon-secret" not found Apr 17 20:52:48.822703 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:48.822666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp" event={"ID":"7e1e5242-44ff-4464-9b66-32c92d278373","Type":"ContainerStarted","Data":"b5eb56cf02370b7c1707efa47a8654c604ce2ded0f9af151640e781fc8aabf9f"} Apr 17 20:52:48.823937 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:48.823908 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wf7nr" event={"ID":"6ddc2491-03fd-4683-9633-ee15d1c98c2d","Type":"ContainerStarted","Data":"ad094b90d75bbf4510c77d757843648ad7b5c2e8f739c83802353d19723ced9a"} Apr 17 20:52:48.825262 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:48.825240 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w" event={"ID":"f7403c3a-b10b-4584-9045-252ae239c2ef","Type":"ContainerStarted","Data":"833f3f04b73da65b315cd68f83f4bd3386240616d03aedb431ba7991735a903a"} Apr 17 20:52:48.838060 ip-10-0-128-248 kubenswrapper[2576]: 
I0417 20:52:48.837982 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp" podStartSLOduration=1.216055184 podStartE2EDuration="3.837967671s" podCreationTimestamp="2026-04-17 20:52:45 +0000 UTC" firstStartedPulling="2026-04-17 20:52:45.922134558 +0000 UTC m=+127.928548569" lastFinishedPulling="2026-04-17 20:52:48.544047046 +0000 UTC m=+130.550461056" observedRunningTime="2026-04-17 20:52:48.836587746 +0000 UTC m=+130.843001779" watchObservedRunningTime="2026-04-17 20:52:48.837967671 +0000 UTC m=+130.844381704" Apr 17 20:52:48.850660 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:48.850352 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-wf7nr" podStartSLOduration=1.221977704 podStartE2EDuration="3.850335898s" podCreationTimestamp="2026-04-17 20:52:45 +0000 UTC" firstStartedPulling="2026-04-17 20:52:45.910064288 +0000 UTC m=+127.916478302" lastFinishedPulling="2026-04-17 20:52:48.538422483 +0000 UTC m=+130.544836496" observedRunningTime="2026-04-17 20:52:48.849967775 +0000 UTC m=+130.856381807" watchObservedRunningTime="2026-04-17 20:52:48.850335898 +0000 UTC m=+130.856749931" Apr 17 20:52:48.864489 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:48.864445 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w" podStartSLOduration=1.3329591330000001 podStartE2EDuration="3.864435929s" podCreationTimestamp="2026-04-17 20:52:45 +0000 UTC" firstStartedPulling="2026-04-17 20:52:46.009829213 +0000 UTC m=+128.016243223" lastFinishedPulling="2026-04-17 20:52:48.541306009 +0000 UTC m=+130.547720019" observedRunningTime="2026-04-17 20:52:48.863599764 +0000 UTC m=+130.870013796" watchObservedRunningTime="2026-04-17 20:52:48.864435929 +0000 UTC m=+130.870849962" Apr 17 20:52:49.611435 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:49.611403 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql"] Apr 17 20:52:49.615589 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:49.615566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql" Apr 17 20:52:49.618340 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:49.618303 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-gcv7s\"" Apr 17 20:52:49.618461 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:49.618342 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 20:52:49.619580 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:49.619564 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 20:52:49.624741 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:49.624719 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql"] Apr 17 20:52:49.734909 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:49.734885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqkhz\" (UniqueName: \"kubernetes.io/projected/9d15f35d-a2cc-4916-9a12-7d0f34fdfe67-kube-api-access-fqkhz\") pod \"migrator-74bb7799d9-dhjql\" (UID: \"9d15f35d-a2cc-4916-9a12-7d0f34fdfe67\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql" Apr 17 20:52:49.835803 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:49.835767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fqkhz\" (UniqueName: \"kubernetes.io/projected/9d15f35d-a2cc-4916-9a12-7d0f34fdfe67-kube-api-access-fqkhz\") pod \"migrator-74bb7799d9-dhjql\" (UID: \"9d15f35d-a2cc-4916-9a12-7d0f34fdfe67\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql" Apr 17 20:52:49.843255 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:49.843237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqkhz\" (UniqueName: \"kubernetes.io/projected/9d15f35d-a2cc-4916-9a12-7d0f34fdfe67-kube-api-access-fqkhz\") pod \"migrator-74bb7799d9-dhjql\" (UID: \"9d15f35d-a2cc-4916-9a12-7d0f34fdfe67\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql" Apr 17 20:52:49.925396 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:49.925314 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql" Apr 17 20:52:50.041300 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:50.041262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql"] Apr 17 20:52:50.044777 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:52:50.044742 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d15f35d_a2cc_4916_9a12_7d0f34fdfe67.slice/crio-9470a1a9914235e044690185ec851fc78cdb625e34acefb935e42e85583ee99e WatchSource:0}: Error finding container 9470a1a9914235e044690185ec851fc78cdb625e34acefb935e42e85583ee99e: Status 404 returned error can't find the container with id 9470a1a9914235e044690185ec851fc78cdb625e34acefb935e42e85583ee99e Apr 17 20:52:50.830850 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:50.830813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql" 
event={"ID":"9d15f35d-a2cc-4916-9a12-7d0f34fdfe67","Type":"ContainerStarted","Data":"9470a1a9914235e044690185ec851fc78cdb625e34acefb935e42e85583ee99e"} Apr 17 20:52:51.835683 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:51.835643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql" event={"ID":"9d15f35d-a2cc-4916-9a12-7d0f34fdfe67","Type":"ContainerStarted","Data":"7a47b82c1d71049db02040e09376f5b9c8a0828904881d0ab6f237c663ac0e3c"} Apr 17 20:52:51.835683 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:51.835681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql" event={"ID":"9d15f35d-a2cc-4916-9a12-7d0f34fdfe67","Type":"ContainerStarted","Data":"aad654edc370e692f29bf9b401360cce2f357e9a69b63adbac9e9709fcc839e9"} Apr 17 20:52:51.853986 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:51.853942 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dhjql" podStartSLOduration=1.874081682 podStartE2EDuration="2.853929886s" podCreationTimestamp="2026-04-17 20:52:49 +0000 UTC" firstStartedPulling="2026-04-17 20:52:50.046609138 +0000 UTC m=+132.053023149" lastFinishedPulling="2026-04-17 20:52:51.026457333 +0000 UTC m=+133.032871353" observedRunningTime="2026-04-17 20:52:51.852445997 +0000 UTC m=+133.858860028" watchObservedRunningTime="2026-04-17 20:52:51.853929886 +0000 UTC m=+133.860343917" Apr 17 20:52:52.740545 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:52.740514 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bv5tl_bfe609fc-0a8b-47f5-ac0d-e7e0c47917de/dns-node-resolver/0.log" Apr 17 20:52:53.940359 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:53.940329 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-q6mlz_bd8e668b-8da4-40de-9724-891b41810e2d/node-ca/0.log" Apr 17 20:52:54.940414 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:54.940377 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dhjql_9d15f35d-a2cc-4916-9a12-7d0f34fdfe67/migrator/0.log" Apr 17 20:52:55.140750 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:55.140716 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dhjql_9d15f35d-a2cc-4916-9a12-7d0f34fdfe67/graceful-termination/0.log" Apr 17 20:52:55.342063 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:52:55.342037 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gzcjp_7e1e5242-44ff-4464-9b66-32c92d278373/kube-storage-version-migrator-operator/0.log" Apr 17 20:53:10.718467 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.718434 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6db95569ff-m8thh"] Apr 17 20:53:10.718877 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:10.718580 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-6db95569ff-m8thh" podUID="ad71e77d-454a-4f81-b1ef-644d66518fd4" Apr 17 20:53:10.736975 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.736946 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-29mv6"] Apr 17 20:53:10.742081 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.742064 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-29mv6" Apr 17 20:53:10.744613 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.744593 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 20:53:10.744705 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.744615 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-j2pjg\"" Apr 17 20:53:10.744705 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.744671 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 20:53:10.750833 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.750812 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-29mv6"] Apr 17 20:53:10.763898 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.763879 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66c59b6986-j28rw"] Apr 17 20:53:10.767011 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.766997 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.780535 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780517 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f793a52c-6af4-4836-bb9d-78c25e56a7ed-bound-sa-token\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.780620 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6" Apr 17 20:53:10.780620 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6" Apr 17 20:53:10.780700 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f793a52c-6af4-4836-bb9d-78c25e56a7ed-registry-tls\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.780700 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780642 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f793a52c-6af4-4836-bb9d-78c25e56a7ed-ca-trust-extracted\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.780700 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f793a52c-6af4-4836-bb9d-78c25e56a7ed-installation-pull-secrets\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.780700 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f793a52c-6af4-4836-bb9d-78c25e56a7ed-image-registry-private-configuration\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.780842 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780717 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f793a52c-6af4-4836-bb9d-78c25e56a7ed-registry-certificates\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.780842 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-crio-socket\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6" Apr 17 20:53:10.780842 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780771 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-data-volume\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6" Apr 17 20:53:10.780842 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmdt5\" (UniqueName: \"kubernetes.io/projected/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-kube-api-access-dmdt5\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6" Apr 17 20:53:10.780842 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f793a52c-6af4-4836-bb9d-78c25e56a7ed-trusted-ca\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.781035 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.780895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrfvw\" (UniqueName: \"kubernetes.io/projected/f793a52c-6af4-4836-bb9d-78c25e56a7ed-kube-api-access-lrfvw\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " 
pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.782397 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.782376 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66c59b6986-j28rw"] Apr 17 20:53:10.878644 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.878616 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6db95569ff-m8thh" Apr 17 20:53:10.881389 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6" Apr 17 20:53:10.881498 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f793a52c-6af4-4836-bb9d-78c25e56a7ed-registry-tls\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.881498 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f793a52c-6af4-4836-bb9d-78c25e56a7ed-ca-trust-extracted\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.881498 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/f793a52c-6af4-4836-bb9d-78c25e56a7ed-installation-pull-secrets\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.881650 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f793a52c-6af4-4836-bb9d-78c25e56a7ed-image-registry-private-configuration\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.881650 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f793a52c-6af4-4836-bb9d-78c25e56a7ed-registry-certificates\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.881746 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-crio-socket\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6" Apr 17 20:53:10.881746 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-data-volume\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " 
pod="openshift-insights/insights-runtime-extractor-29mv6" Apr 17 20:53:10.881859 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881746 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmdt5\" (UniqueName: \"kubernetes.io/projected/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-kube-api-access-dmdt5\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6" Apr 17 20:53:10.881859 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f793a52c-6af4-4836-bb9d-78c25e56a7ed-trusted-ca\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.881859 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrfvw\" (UniqueName: \"kubernetes.io/projected/f793a52c-6af4-4836-bb9d-78c25e56a7ed-kube-api-access-lrfvw\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.881993 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f793a52c-6af4-4836-bb9d-78c25e56a7ed-ca-trust-extracted\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:10.881993 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/f793a52c-6af4-4836-bb9d-78c25e56a7ed-bound-sa-token\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw"
Apr 17 20:53:10.882113 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.881998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6"
Apr 17 20:53:10.882695 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.882359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-crio-socket\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6"
Apr 17 20:53:10.882695 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.882629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-data-volume\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6"
Apr 17 20:53:10.882695 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.882672 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f793a52c-6af4-4836-bb9d-78c25e56a7ed-registry-certificates\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw"
Apr 17 20:53:10.882695 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.882691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6"
Apr 17 20:53:10.883329 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.883305 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f793a52c-6af4-4836-bb9d-78c25e56a7ed-trusted-ca\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw"
Apr 17 20:53:10.883440 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.883427 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:53:10.884131 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.884103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f793a52c-6af4-4836-bb9d-78c25e56a7ed-registry-tls\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw"
Apr 17 20:53:10.884406 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.884389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f793a52c-6af4-4836-bb9d-78c25e56a7ed-image-registry-private-configuration\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw"
Apr 17 20:53:10.884747 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.884731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6"
Apr 17 20:53:10.884805 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.884784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f793a52c-6af4-4836-bb9d-78c25e56a7ed-installation-pull-secrets\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw"
Apr 17 20:53:10.895689 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.895666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f793a52c-6af4-4836-bb9d-78c25e56a7ed-bound-sa-token\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw"
Apr 17 20:53:10.896537 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.896520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmdt5\" (UniqueName: \"kubernetes.io/projected/1b4762c6-58a6-4aa4-83cd-d9b76247b1e7-kube-api-access-dmdt5\") pod \"insights-runtime-extractor-29mv6\" (UID: \"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7\") " pod="openshift-insights/insights-runtime-extractor-29mv6"
Apr 17 20:53:10.896680 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.896661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrfvw\" (UniqueName: \"kubernetes.io/projected/f793a52c-6af4-4836-bb9d-78c25e56a7ed-kube-api-access-lrfvw\") pod \"image-registry-66c59b6986-j28rw\" (UID: \"f793a52c-6af4-4836-bb9d-78c25e56a7ed\") " pod="openshift-image-registry/image-registry-66c59b6986-j28rw"
Apr 17 20:53:10.982254 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.982207 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-bound-sa-token\") pod \"ad71e77d-454a-4f81-b1ef-644d66518fd4\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") "
Apr 17 20:53:10.982254 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.982237 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-image-registry-private-configuration\") pod \"ad71e77d-454a-4f81-b1ef-644d66518fd4\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") "
Apr 17 20:53:10.982398 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.982269 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-trusted-ca\") pod \"ad71e77d-454a-4f81-b1ef-644d66518fd4\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") "
Apr 17 20:53:10.982398 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.982295 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-certificates\") pod \"ad71e77d-454a-4f81-b1ef-644d66518fd4\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") "
Apr 17 20:53:10.982398 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.982317 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-installation-pull-secrets\") pod \"ad71e77d-454a-4f81-b1ef-644d66518fd4\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") "
Apr 17 20:53:10.982398 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.982369 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad71e77d-454a-4f81-b1ef-644d66518fd4-ca-trust-extracted\") pod \"ad71e77d-454a-4f81-b1ef-644d66518fd4\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") "
Apr 17 20:53:10.982577 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.982415 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gdsf\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-kube-api-access-8gdsf\") pod \"ad71e77d-454a-4f81-b1ef-644d66518fd4\" (UID: \"ad71e77d-454a-4f81-b1ef-644d66518fd4\") "
Apr 17 20:53:10.982687 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.982661 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ad71e77d-454a-4f81-b1ef-644d66518fd4" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:53:10.982740 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.982668 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad71e77d-454a-4f81-b1ef-644d66518fd4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ad71e77d-454a-4f81-b1ef-644d66518fd4" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:53:10.982740 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.982725 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ad71e77d-454a-4f81-b1ef-644d66518fd4" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:53:10.984280 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.984249 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ad71e77d-454a-4f81-b1ef-644d66518fd4" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:53:10.984368 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.984323 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ad71e77d-454a-4f81-b1ef-644d66518fd4" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:53:10.984421 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.984363 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ad71e77d-454a-4f81-b1ef-644d66518fd4" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:53:10.984460 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:10.984416 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-kube-api-access-8gdsf" (OuterVolumeSpecName: "kube-api-access-8gdsf") pod "ad71e77d-454a-4f81-b1ef-644d66518fd4" (UID: "ad71e77d-454a-4f81-b1ef-644d66518fd4"). InnerVolumeSpecName "kube-api-access-8gdsf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:53:11.050895 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.050865 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-29mv6"
Apr 17 20:53:11.078333 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.078311 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-j6pkz\""
Apr 17 20:53:11.083773 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.083752 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8gdsf\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-kube-api-access-8gdsf\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:53:11.083773 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.083772 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-bound-sa-token\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:53:11.083892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.083783 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-image-registry-private-configuration\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:53:11.083892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.083792 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-trusted-ca\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:53:11.083892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.083802 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-certificates\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:53:11.083892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.083811 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad71e77d-454a-4f81-b1ef-644d66518fd4-installation-pull-secrets\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:53:11.083892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.083819 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad71e77d-454a-4f81-b1ef-644d66518fd4-ca-trust-extracted\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:53:11.086708 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.086692 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66c59b6986-j28rw"
Apr 17 20:53:11.169259 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.169157 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-29mv6"]
Apr 17 20:53:11.171604 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:53:11.171579 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b4762c6_58a6_4aa4_83cd_d9b76247b1e7.slice/crio-bf8e2a3cdc77ad56ab4184eb598b43e0a8654a749899432032e858a6e7120b59 WatchSource:0}: Error finding container bf8e2a3cdc77ad56ab4184eb598b43e0a8654a749899432032e858a6e7120b59: Status 404 returned error can't find the container with id bf8e2a3cdc77ad56ab4184eb598b43e0a8654a749899432032e858a6e7120b59
Apr 17 20:53:11.212667 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.212641 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66c59b6986-j28rw"]
Apr 17 20:53:11.215007 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:53:11.214979 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf793a52c_6af4_4836_bb9d_78c25e56a7ed.slice/crio-8ef51f9cdf968a1f349f0b5b6d3dbb12a3d02c18450ddaa747b4146976471544 WatchSource:0}: Error finding container 8ef51f9cdf968a1f349f0b5b6d3dbb12a3d02c18450ddaa747b4146976471544: Status 404 returned error can't find the container with id 8ef51f9cdf968a1f349f0b5b6d3dbb12a3d02c18450ddaa747b4146976471544
Apr 17 20:53:11.881896 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.881865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66c59b6986-j28rw" event={"ID":"f793a52c-6af4-4836-bb9d-78c25e56a7ed","Type":"ContainerStarted","Data":"4645044f06837337c3b33f49b2570365859692c677a3652ed840a6e77e978ed5"}
Apr 17 20:53:11.882259 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.881908 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66c59b6986-j28rw" event={"ID":"f793a52c-6af4-4836-bb9d-78c25e56a7ed","Type":"ContainerStarted","Data":"8ef51f9cdf968a1f349f0b5b6d3dbb12a3d02c18450ddaa747b4146976471544"}
Apr 17 20:53:11.882259 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.881967 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-66c59b6986-j28rw"
Apr 17 20:53:11.883020 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.882995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-29mv6" event={"ID":"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7","Type":"ContainerStarted","Data":"9118c0703c2bdbd934b69412902dd6a4e67d1f848f7018f1c0c1852163ee263c"}
Apr 17 20:53:11.883020 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.883015 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6db95569ff-m8thh"
Apr 17 20:53:11.883020 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.883039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-29mv6" event={"ID":"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7","Type":"ContainerStarted","Data":"bf8e2a3cdc77ad56ab4184eb598b43e0a8654a749899432032e858a6e7120b59"}
Apr 17 20:53:11.905212 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.905168 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66c59b6986-j28rw" podStartSLOduration=1.9051538159999999 podStartE2EDuration="1.905153816s" podCreationTimestamp="2026-04-17 20:53:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:53:11.904424568 +0000 UTC m=+153.910838599" watchObservedRunningTime="2026-04-17 20:53:11.905153816 +0000 UTC m=+153.911567849"
Apr 17 20:53:11.934834 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.934811 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6db95569ff-m8thh"]
Apr 17 20:53:11.941359 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.941342 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6db95569ff-m8thh"]
Apr 17 20:53:11.991186 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:11.991161 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad71e77d-454a-4f81-b1ef-644d66518fd4-registry-tls\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:53:12.486529 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:12.486494 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad71e77d-454a-4f81-b1ef-644d66518fd4" path="/var/lib/kubelet/pods/ad71e77d-454a-4f81-b1ef-644d66518fd4/volumes"
Apr 17 20:53:12.887727 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:12.887688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-29mv6" event={"ID":"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7","Type":"ContainerStarted","Data":"e4648be66101444127371390ad73f668b22fbfc8725adf6b68f4a114bd8ab887"}
Apr 17 20:53:13.891404 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:13.891369 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-29mv6" event={"ID":"1b4762c6-58a6-4aa4-83cd-d9b76247b1e7","Type":"ContainerStarted","Data":"53b500d90422cd942047ccb4ea8980303ddad1c2fe280780ac2497101d87b31a"}
Apr 17 20:53:13.909610 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:13.909568 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-29mv6" podStartSLOduration=1.9131536329999999 podStartE2EDuration="3.909555312s" podCreationTimestamp="2026-04-17 20:53:10 +0000 UTC" firstStartedPulling="2026-04-17 20:53:11.228125657 +0000 UTC m=+153.234539668" lastFinishedPulling="2026-04-17 20:53:13.224527337 +0000 UTC m=+155.230941347" observedRunningTime="2026-04-17 20:53:13.908953034 +0000 UTC m=+155.915367065" watchObservedRunningTime="2026-04-17 20:53:13.909555312 +0000 UTC m=+155.915969344"
Apr 17 20:53:13.918097 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:13.918071 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wsdk8" podUID="be99f611-577c-4ca5-8103-64206cf5c3bd"
Apr 17 20:53:13.925218 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:13.925197 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ws9kw" podUID="d2a7b59a-da53-4886-af79-da4e37b76a3e"
Apr 17 20:53:14.893827 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:14.893798 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:53:14.894232 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:14.893912 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:53:15.492767 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:15.492726 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-z2lfh" podUID="f35870d4-a814-46cd-8764-30e79f3a7d79"
Apr 17 20:53:18.838467 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:18.838436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:53:18.838906 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:18.838473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:53:18.840814 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:18.840788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2a7b59a-da53-4886-af79-da4e37b76a3e-metrics-tls\") pod \"dns-default-ws9kw\" (UID: \"d2a7b59a-da53-4886-af79-da4e37b76a3e\") " pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:53:18.840814 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:18.840797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be99f611-577c-4ca5-8103-64206cf5c3bd-cert\") pod \"ingress-canary-wsdk8\" (UID: \"be99f611-577c-4ca5-8103-64206cf5c3bd\") " pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:53:19.097639 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:19.097566 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tswpv\""
Apr 17 20:53:19.098690 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:19.098667 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jvfcm\""
Apr 17 20:53:19.105760 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:19.105742 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wsdk8"
Apr 17 20:53:19.105828 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:19.105766 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:53:19.227957 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:19.227930 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wsdk8"]
Apr 17 20:53:19.231117 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:53:19.231080 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe99f611_577c_4ca5_8103_64206cf5c3bd.slice/crio-5e3ab10fe701d1ad9b0d905a75d189d54f501a102edb1c7f20bb451124047867 WatchSource:0}: Error finding container 5e3ab10fe701d1ad9b0d905a75d189d54f501a102edb1c7f20bb451124047867: Status 404 returned error can't find the container with id 5e3ab10fe701d1ad9b0d905a75d189d54f501a102edb1c7f20bb451124047867
Apr 17 20:53:19.257256 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:19.255702 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ws9kw"]
Apr 17 20:53:19.260737 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:53:19.260715 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2a7b59a_da53_4886_af79_da4e37b76a3e.slice/crio-f568020b3816fb269990c15a05e0697f63c7c868edd25eeea878e0483f611c45 WatchSource:0}: Error finding container f568020b3816fb269990c15a05e0697f63c7c868edd25eeea878e0483f611c45: Status 404 returned error can't find the container with id f568020b3816fb269990c15a05e0697f63c7c868edd25eeea878e0483f611c45
Apr 17 20:53:19.906491 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:19.906450 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wsdk8" event={"ID":"be99f611-577c-4ca5-8103-64206cf5c3bd","Type":"ContainerStarted","Data":"5e3ab10fe701d1ad9b0d905a75d189d54f501a102edb1c7f20bb451124047867"}
Apr 17 20:53:19.907482 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:19.907456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ws9kw" event={"ID":"d2a7b59a-da53-4886-af79-da4e37b76a3e","Type":"ContainerStarted","Data":"f568020b3816fb269990c15a05e0697f63c7c868edd25eeea878e0483f611c45"}
Apr 17 20:53:20.175871 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:20.175801 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks"]
Apr 17 20:53:20.179015 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:20.178993 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks"
Apr 17 20:53:20.181463 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:20.181438 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-rgjld\""
Apr 17 20:53:20.181601 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:20.181584 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 20:53:20.184809 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:20.184771 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks"]
Apr 17 20:53:20.264048 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:20.264005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6113635f-b0f8-443f-935b-a31f49f907ad-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qwgks\" (UID: \"6113635f-b0f8-443f-935b-a31f49f907ad\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks"
Apr 17 20:53:20.364957 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:20.364921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6113635f-b0f8-443f-935b-a31f49f907ad-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qwgks\" (UID: \"6113635f-b0f8-443f-935b-a31f49f907ad\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks"
Apr 17 20:53:20.365125 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:20.365073 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 20:53:20.365199 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:20.365140 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6113635f-b0f8-443f-935b-a31f49f907ad-tls-certificates podName:6113635f-b0f8-443f-935b-a31f49f907ad nodeName:}" failed. No retries permitted until 2026-04-17 20:53:20.865121207 +0000 UTC m=+162.871535234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/6113635f-b0f8-443f-935b-a31f49f907ad-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-qwgks" (UID: "6113635f-b0f8-443f-935b-a31f49f907ad") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 20:53:20.869414 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:20.869378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6113635f-b0f8-443f-935b-a31f49f907ad-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qwgks\" (UID: \"6113635f-b0f8-443f-935b-a31f49f907ad\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks"
Apr 17 20:53:20.871975 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:20.871949 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6113635f-b0f8-443f-935b-a31f49f907ad-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qwgks\" (UID: \"6113635f-b0f8-443f-935b-a31f49f907ad\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks"
Apr 17 20:53:20.911951 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:20.911888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ws9kw" event={"ID":"d2a7b59a-da53-4886-af79-da4e37b76a3e","Type":"ContainerStarted","Data":"47f65939b02c956d510b3d214036edd2d404c4bbbd329a35256a15bc16d7979f"}
Apr 17 20:53:21.091328 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:21.091271 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks"
Apr 17 20:53:21.321193 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:21.321169 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks"]
Apr 17 20:53:21.324239 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:53:21.324214 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6113635f_b0f8_443f_935b_a31f49f907ad.slice/crio-17a7005e4eb5669602960f67f23cea9685880c21189f9fcc5148098946418b86 WatchSource:0}: Error finding container 17a7005e4eb5669602960f67f23cea9685880c21189f9fcc5148098946418b86: Status 404 returned error can't find the container with id 17a7005e4eb5669602960f67f23cea9685880c21189f9fcc5148098946418b86
Apr 17 20:53:21.916682 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:21.916648 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ws9kw" event={"ID":"d2a7b59a-da53-4886-af79-da4e37b76a3e","Type":"ContainerStarted","Data":"a67a064009105eb1646aa2eb07c931bdb347f82bd5569a0fcfc60594d66c97ad"}
Apr 17 20:53:21.916682 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:21.916692 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ws9kw"
Apr 17 20:53:21.917776 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:21.917748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks" event={"ID":"6113635f-b0f8-443f-935b-a31f49f907ad","Type":"ContainerStarted","Data":"17a7005e4eb5669602960f67f23cea9685880c21189f9fcc5148098946418b86"}
Apr 17 20:53:21.919098 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:21.919073 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wsdk8" event={"ID":"be99f611-577c-4ca5-8103-64206cf5c3bd","Type":"ContainerStarted","Data":"008d38841828c1410ce62b3fd5f99882c5874dcd568656873d8b033906c1eeda"}
Apr 17 20:53:21.932077 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:21.932016 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ws9kw" podStartSLOduration=130.560868469 podStartE2EDuration="2m11.932004022s" podCreationTimestamp="2026-04-17 20:51:10 +0000 UTC" firstStartedPulling="2026-04-17 20:53:19.26237114 +0000 UTC m=+161.268785153" lastFinishedPulling="2026-04-17 20:53:20.633506692 +0000 UTC m=+162.639920706" observedRunningTime="2026-04-17 20:53:21.931445032 +0000 UTC m=+163.937859092" watchObservedRunningTime="2026-04-17 20:53:21.932004022 +0000 UTC m=+163.938418048"
Apr 17 20:53:21.944764 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:21.944690 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wsdk8" podStartSLOduration=129.920522862 podStartE2EDuration="2m11.944679458s" podCreationTimestamp="2026-04-17 20:51:10 +0000 UTC" firstStartedPulling="2026-04-17 20:53:19.233369126 +0000 UTC m=+161.239783139" lastFinishedPulling="2026-04-17 20:53:21.257525706 +0000 UTC m=+163.263939735" observedRunningTime="2026-04-17 20:53:21.94458197 +0000 UTC m=+163.950996004" watchObservedRunningTime="2026-04-17 20:53:21.944679458 +0000 UTC m=+163.951093489"
Apr 17 20:53:22.923096 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:22.923056 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks" event={"ID":"6113635f-b0f8-443f-935b-a31f49f907ad","Type":"ContainerStarted","Data":"95a20c231480a457da4d3a1bc9d4d101705cf5c641b9576d610199b1d0a6480a"}
Apr 17 20:53:22.923522 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:22.923236 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks"
Apr 17 20:53:22.927581 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:22.927560 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks"
Apr 17 20:53:22.943554 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:22.943518 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qwgks" podStartSLOduration=1.994334464 podStartE2EDuration="2.94350788s" podCreationTimestamp="2026-04-17 20:53:20 +0000 UTC" firstStartedPulling="2026-04-17 20:53:21.326172146 +0000 UTC m=+163.332586160" lastFinishedPulling="2026-04-17 20:53:22.275345563 +0000 UTC m=+164.281759576" observedRunningTime="2026-04-17 20:53:22.942565389 +0000 UTC m=+164.948979443" watchObservedRunningTime="2026-04-17 20:53:22.94350788 +0000 UTC m=+164.949921911"
Apr 17 20:53:26.483559 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:26.483522 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh"
Apr 17 20:53:27.567281 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.567247 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77"]
Apr 17 20:53:27.570519 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.570498 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77"
Apr 17 20:53:27.573281 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.573257 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 20:53:27.573508 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.573259 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 20:53:27.574403 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.574388 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 20:53:27.574553 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.574535 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 20:53:27.574617 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.574592 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 20:53:27.574671 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.574623 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-vxkgf\""
Apr 17 20:53:27.583541 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.583523 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77"]
Apr 17 20:53:27.587369 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.587349 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4xl6v"]
Apr 17 20:53:27.590410 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.590393 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v"
Apr 17 20:53:27.593144 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.593125 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 20:53:27.593243 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.593131 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-w7zbw\""
Apr 17 20:53:27.593243 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.593181 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 20:53:27.593243 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.593197 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 20:53:27.601040 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.601008 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-98rrz"]
Apr 17 20:53:27.604255 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.604237 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4xl6v"]
Apr 17 20:53:27.604357 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.604345 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.608422 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.608398 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 20:53:27.608686 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.608668 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 20:53:27.609659 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.609635 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hmlbz\"" Apr 17 20:53:27.610648 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.610626 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 20:53:27.716806 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.716782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80324393-b669-4bf0-9373-8962675c7e34-metrics-client-ca\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.716899 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.716811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.716899 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.716831 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-textfile\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.716899 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.716855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj8q4\" (UniqueName: \"kubernetes.io/projected/80324393-b669-4bf0-9373-8962675c7e34-kube-api-access-dj8q4\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.716995 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.716898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-accelerators-collector-config\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.716995 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.716959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frb8d\" (UniqueName: \"kubernetes.io/projected/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-api-access-frb8d\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.716995 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.716988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-wtmp\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.717143 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1049aead-593a-4542-8f3d-64e8b6e5868c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.717143 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1049aead-593a-4542-8f3d-64e8b6e5868c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.717143 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3366da27-91fd-4f57-9f19-7f1a7aca9a23-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: \"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:27.717143 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3366da27-91fd-4f57-9f19-7f1a7aca9a23-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: 
\"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:27.717143 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3366da27-91fd-4f57-9f19-7f1a7aca9a23-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: \"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:27.717285 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.717285 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.717285 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80324393-b669-4bf0-9373-8962675c7e34-sys\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " 
pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.717285 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-tls\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.717285 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwqs5\" (UniqueName: \"kubernetes.io/projected/3366da27-91fd-4f57-9f19-7f1a7aca9a23-kube-api-access-qwqs5\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: \"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:27.717285 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/80324393-b669-4bf0-9373-8962675c7e34-root\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.717454 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.717292 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.818269 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-accelerators-collector-config\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.818269 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frb8d\" (UniqueName: \"kubernetes.io/projected/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-api-access-frb8d\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.818269 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-wtmp\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.818448 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1049aead-593a-4542-8f3d-64e8b6e5868c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.818448 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1049aead-593a-4542-8f3d-64e8b6e5868c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.818448 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3366da27-91fd-4f57-9f19-7f1a7aca9a23-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: \"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:27.818448 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818345 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3366da27-91fd-4f57-9f19-7f1a7aca9a23-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: \"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:27.818448 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3366da27-91fd-4f57-9f19-7f1a7aca9a23-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: \"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:27.818448 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.818448 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:53:27.818416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-wtmp\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:27.818498 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80324393-b669-4bf0-9373-8962675c7e34-sys\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-tls\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:27.818577 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3366da27-91fd-4f57-9f19-7f1a7aca9a23-openshift-state-metrics-tls podName:3366da27-91fd-4f57-9f19-7f1a7aca9a23 nodeName:}" failed. No retries permitted until 2026-04-17 20:53:28.318553924 +0000 UTC m=+170.324967957 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/3366da27-91fd-4f57-9f19-7f1a7aca9a23-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-xpj77" (UID: "3366da27-91fd-4f57-9f19-7f1a7aca9a23") : secret "openshift-state-metrics-tls" not found Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:27.818598 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:27.818621 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwqs5\" (UniqueName: \"kubernetes.io/projected/3366da27-91fd-4f57-9f19-7f1a7aca9a23-kube-api-access-qwqs5\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: \"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:27.818666 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-tls podName:1049aead-593a-4542-8f3d-64e8b6e5868c nodeName:}" failed. No retries permitted until 2026-04-17 20:53:28.318646964 +0000 UTC m=+170.325060980 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-4xl6v" (UID: "1049aead-593a-4542-8f3d-64e8b6e5868c") : secret "kube-state-metrics-tls" not found Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/80324393-b669-4bf0-9373-8962675c7e34-root\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80324393-b669-4bf0-9373-8962675c7e34-sys\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1049aead-593a-4542-8f3d-64e8b6e5868c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:53:27.818731 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-tls podName:80324393-b669-4bf0-9373-8962675c7e34 nodeName:}" failed. No retries permitted until 2026-04-17 20:53:28.318717174 +0000 UTC m=+170.325131188 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-tls") pod "node-exporter-98rrz" (UID: "80324393-b669-4bf0-9373-8962675c7e34") : secret "node-exporter-tls" not found Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/80324393-b669-4bf0-9373-8962675c7e34-root\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.818758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.819542 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80324393-b669-4bf0-9373-8962675c7e34-metrics-client-ca\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.819542 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.819542 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-textfile\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.819542 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818865 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-accelerators-collector-config\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.819542 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.818914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj8q4\" (UniqueName: \"kubernetes.io/projected/80324393-b669-4bf0-9373-8962675c7e34-kube-api-access-dj8q4\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.819542 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.819138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-textfile\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.819542 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.819147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3366da27-91fd-4f57-9f19-7f1a7aca9a23-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: 
\"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:27.819542 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.819196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.819542 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.819339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1049aead-593a-4542-8f3d-64e8b6e5868c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.819542 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.819484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80324393-b669-4bf0-9373-8962675c7e34-metrics-client-ca\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.821146 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.821126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.821238 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.821181 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3366da27-91fd-4f57-9f19-7f1a7aca9a23-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: \"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:27.821333 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.821313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.831947 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.831925 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj8q4\" (UniqueName: \"kubernetes.io/projected/80324393-b669-4bf0-9373-8962675c7e34-kube-api-access-dj8q4\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:27.832991 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.832971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frb8d\" (UniqueName: \"kubernetes.io/projected/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-api-access-frb8d\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:27.833113 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:27.833096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwqs5\" (UniqueName: \"kubernetes.io/projected/3366da27-91fd-4f57-9f19-7f1a7aca9a23-kube-api-access-qwqs5\") pod 
\"openshift-state-metrics-9d44df66c-xpj77\" (UID: \"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:28.322582 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.322554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3366da27-91fd-4f57-9f19-7f1a7aca9a23-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: \"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:28.322728 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.322596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:28.322728 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.322616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-tls\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:28.324908 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.324870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/80324393-b669-4bf0-9373-8962675c7e34-node-exporter-tls\") pod \"node-exporter-98rrz\" (UID: \"80324393-b669-4bf0-9373-8962675c7e34\") " pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:28.325049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.324942 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3366da27-91fd-4f57-9f19-7f1a7aca9a23-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xpj77\" (UID: \"3366da27-91fd-4f57-9f19-7f1a7aca9a23\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:28.325049 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.324963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1049aead-593a-4542-8f3d-64e8b6e5868c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4xl6v\" (UID: \"1049aead-593a-4542-8f3d-64e8b6e5868c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:28.479057 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.479021 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" Apr 17 20:53:28.498494 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.498468 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" Apr 17 20:53:28.515199 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.515173 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-98rrz" Apr 17 20:53:28.606480 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.606445 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77"] Apr 17 20:53:28.609288 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:53:28.609252 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3366da27_91fd_4f57_9f19_7f1a7aca9a23.slice/crio-9b77c1e9631d5f8a2dc47026f40815151300b7edf4c5e8a138d0238052d159ba WatchSource:0}: Error finding container 9b77c1e9631d5f8a2dc47026f40815151300b7edf4c5e8a138d0238052d159ba: Status 404 returned error can't find the container with id 9b77c1e9631d5f8a2dc47026f40815151300b7edf4c5e8a138d0238052d159ba Apr 17 20:53:28.622999 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.622973 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4xl6v"] Apr 17 20:53:28.625488 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:53:28.625465 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1049aead_593a_4542_8f3d_64e8b6e5868c.slice/crio-67f058dd29f989d7dacbdc41d951e27ad13573762440223e3eb54a09f194d483 WatchSource:0}: Error finding container 67f058dd29f989d7dacbdc41d951e27ad13573762440223e3eb54a09f194d483: Status 404 returned error can't find the container with id 67f058dd29f989d7dacbdc41d951e27ad13573762440223e3eb54a09f194d483 Apr 17 20:53:28.680390 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.680368 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:53:28.685386 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.685362 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.688129 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.687937 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 20:53:28.688230 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.688180 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 20:53:28.688399 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.688380 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 20:53:28.688494 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.688459 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-ml5lb\"" Apr 17 20:53:28.688616 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.688600 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 20:53:28.688668 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.688654 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 20:53:28.688869 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.688776 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 20:53:28.688869 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.688866 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 20:53:28.689013 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.688967 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 20:53:28.689013 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.689006 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 20:53:28.696990 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.696952 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:53:28.826065 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826207 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826207 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkzx\" (UniqueName: \"kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-kube-api-access-tvkzx\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826207 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-config-out\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826207 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826423 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826423 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-web-config\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826423 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826273 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826423 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-config-volume\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826423 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826423 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826423 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.826423 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.826403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.927314 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.927314 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.927549 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.927549 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.927549 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927389 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkzx\" (UniqueName: \"kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-kube-api-access-tvkzx\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.927549 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-config-out\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.927549 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.927850 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.927850 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-web-config\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.927850 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.927850 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-config-volume\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.928062 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.928062 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.927892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.928536 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.928224 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.928536 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.928474 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.929078 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.929054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.930868 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.930432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.930868 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.930681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.930868 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.930831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.931491 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.931205 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-config-out\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.931491 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.931454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.931752 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.931722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-config-volume\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.932339 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.932317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.932780 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.932749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.933267 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.933237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-web-config\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.934947 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.934928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkzx\" (UniqueName: \"kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-kube-api-access-tvkzx\") pod \"alertmanager-main-0\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:28.939790 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.939765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" event={"ID":"1049aead-593a-4542-8f3d-64e8b6e5868c","Type":"ContainerStarted","Data":"67f058dd29f989d7dacbdc41d951e27ad13573762440223e3eb54a09f194d483"} Apr 17 20:53:28.941562 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.941533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" event={"ID":"3366da27-91fd-4f57-9f19-7f1a7aca9a23","Type":"ContainerStarted","Data":"bf07332867e1cefbf049d729334ace21ae64b931fd37f0ea7db6e2e7def74c5b"} Apr 17 20:53:28.941650 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.941565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" 
event={"ID":"3366da27-91fd-4f57-9f19-7f1a7aca9a23","Type":"ContainerStarted","Data":"0022ccf67719e1b918fb25fb510ece6ca109154d403016e4d5271baadf90a3de"} Apr 17 20:53:28.941650 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.941580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" event={"ID":"3366da27-91fd-4f57-9f19-7f1a7aca9a23","Type":"ContainerStarted","Data":"9b77c1e9631d5f8a2dc47026f40815151300b7edf4c5e8a138d0238052d159ba"} Apr 17 20:53:28.942703 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.942676 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98rrz" event={"ID":"80324393-b669-4bf0-9373-8962675c7e34","Type":"ContainerStarted","Data":"9ab55676e9d9a7e661216de0254a1c7c3c9eda117ab0b07f0b7aef88f509170e"} Apr 17 20:53:28.999149 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:28.999125 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:53:29.143195 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:29.142937 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:53:29.211468 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:53:29.211398 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba720c9a_8559_424f_aabc_7fc45c75698f.slice/crio-d6e9a21ad2a71e6dc927c081060806518b0fe0fa6aab65eebd18fbc6a8c16caf WatchSource:0}: Error finding container d6e9a21ad2a71e6dc927c081060806518b0fe0fa6aab65eebd18fbc6a8c16caf: Status 404 returned error can't find the container with id d6e9a21ad2a71e6dc927c081060806518b0fe0fa6aab65eebd18fbc6a8c16caf Apr 17 20:53:29.947861 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:29.947760 2576 generic.go:358] "Generic (PLEG): container finished" podID="80324393-b669-4bf0-9373-8962675c7e34" 
containerID="aed50a4bf07cdde974f99bb4e025277cb4edc5bcd4b65a7a4536359854f0546f" exitCode=0 Apr 17 20:53:29.947861 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:29.947840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98rrz" event={"ID":"80324393-b669-4bf0-9373-8962675c7e34","Type":"ContainerDied","Data":"aed50a4bf07cdde974f99bb4e025277cb4edc5bcd4b65a7a4536359854f0546f"} Apr 17 20:53:29.949398 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:29.949358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerStarted","Data":"d6e9a21ad2a71e6dc927c081060806518b0fe0fa6aab65eebd18fbc6a8c16caf"} Apr 17 20:53:30.954134 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:30.954094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" event={"ID":"3366da27-91fd-4f57-9f19-7f1a7aca9a23","Type":"ContainerStarted","Data":"2d590cabfa1f48368b4d726d31f1b13677d15eeabcd05ca86f23a18dd881067b"} Apr 17 20:53:30.956007 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:30.955982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98rrz" event={"ID":"80324393-b669-4bf0-9373-8962675c7e34","Type":"ContainerStarted","Data":"f73b16e62bbe41629cdb8613fe1f62172495304cdf0764c04e45e222523d113d"} Apr 17 20:53:30.956134 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:30.956013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98rrz" event={"ID":"80324393-b669-4bf0-9373-8962675c7e34","Type":"ContainerStarted","Data":"0fdf7384ded0ecae737ed0650b81e13afa4c044f8f0e470e6a78e729ff9510dd"} Apr 17 20:53:30.957718 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:30.957690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" 
event={"ID":"1049aead-593a-4542-8f3d-64e8b6e5868c","Type":"ContainerStarted","Data":"2df94d77686c8e1086c714794dd509be2ca14e8f9dd25f73a5b43d61b0d76fba"} Apr 17 20:53:30.957841 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:30.957725 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" event={"ID":"1049aead-593a-4542-8f3d-64e8b6e5868c","Type":"ContainerStarted","Data":"089feee489d51c5d39d4ada3f7a1bb94c1afdcaba8f15bf6a921f7104028e382"} Apr 17 20:53:30.957841 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:30.957738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" event={"ID":"1049aead-593a-4542-8f3d-64e8b6e5868c","Type":"ContainerStarted","Data":"98b735317716804a151398da06aefc43aba191956fe3eb6376f46230235e1e6d"} Apr 17 20:53:30.958879 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:30.958855 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerID="d2d171ec4d0a1ecfcb1b5765ae2ed5ec487b0953803f1e9da636cee410f6c728" exitCode=0 Apr 17 20:53:30.958951 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:30.958902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerDied","Data":"d2d171ec4d0a1ecfcb1b5765ae2ed5ec487b0953803f1e9da636cee410f6c728"} Apr 17 20:53:30.971016 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:30.970980 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xpj77" podStartSLOduration=2.791441734 podStartE2EDuration="3.970967849s" podCreationTimestamp="2026-04-17 20:53:27 +0000 UTC" firstStartedPulling="2026-04-17 20:53:28.721950623 +0000 UTC m=+170.728364633" lastFinishedPulling="2026-04-17 20:53:29.901476726 +0000 UTC m=+171.907890748" observedRunningTime="2026-04-17 
20:53:30.970119233 +0000 UTC m=+172.976533264" watchObservedRunningTime="2026-04-17 20:53:30.970967849 +0000 UTC m=+172.977381872" Apr 17 20:53:30.987001 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:30.986966 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-98rrz" podStartSLOduration=3.257717656 podStartE2EDuration="3.986953151s" podCreationTimestamp="2026-04-17 20:53:27 +0000 UTC" firstStartedPulling="2026-04-17 20:53:28.529220179 +0000 UTC m=+170.535634190" lastFinishedPulling="2026-04-17 20:53:29.258455663 +0000 UTC m=+171.264869685" observedRunningTime="2026-04-17 20:53:30.986300348 +0000 UTC m=+172.992714382" watchObservedRunningTime="2026-04-17 20:53:30.986953151 +0000 UTC m=+172.993367187" Apr 17 20:53:31.023612 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:31.023575 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-4xl6v" podStartSLOduration=2.751734859 podStartE2EDuration="4.023564255s" podCreationTimestamp="2026-04-17 20:53:27 +0000 UTC" firstStartedPulling="2026-04-17 20:53:28.627077006 +0000 UTC m=+170.633491017" lastFinishedPulling="2026-04-17 20:53:29.898906389 +0000 UTC m=+171.905320413" observedRunningTime="2026-04-17 20:53:31.022987814 +0000 UTC m=+173.029401847" watchObservedRunningTime="2026-04-17 20:53:31.023564255 +0000 UTC m=+173.029978287" Apr 17 20:53:31.090825 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:31.090788 2576 patch_prober.go:28] interesting pod/image-registry-66c59b6986-j28rw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 20:53:31.090953 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:31.090841 2576 prober.go:120] "Probe failed" probeType="Liveness" 
pod="openshift-image-registry/image-registry-66c59b6986-j28rw" podUID="f793a52c-6af4-4836-bb9d-78c25e56a7ed" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 20:53:31.925645 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:31.925617 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ws9kw" Apr 17 20:53:32.348455 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.348431 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-s89br"] Apr 17 20:53:32.351810 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.351793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s89br" Apr 17 20:53:32.354441 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.354391 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 20:53:32.354548 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.354451 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-5jw5m\"" Apr 17 20:53:32.358142 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.358124 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-s89br"] Apr 17 20:53:32.460809 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.460782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e0274df8-8f37-41d5-9ac8-89ed0c37fdd0-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-s89br\" (UID: \"e0274df8-8f37-41d5-9ac8-89ed0c37fdd0\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s89br" Apr 17 20:53:32.561581 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.561554 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e0274df8-8f37-41d5-9ac8-89ed0c37fdd0-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-s89br\" (UID: \"e0274df8-8f37-41d5-9ac8-89ed0c37fdd0\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s89br" Apr 17 20:53:32.563775 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.563757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e0274df8-8f37-41d5-9ac8-89ed0c37fdd0-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-s89br\" (UID: \"e0274df8-8f37-41d5-9ac8-89ed0c37fdd0\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s89br" Apr 17 20:53:32.671833 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.671808 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s89br" Apr 17 20:53:32.783469 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.783371 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-s89br"] Apr 17 20:53:32.785788 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:53:32.785759 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0274df8_8f37_41d5_9ac8_89ed0c37fdd0.slice/crio-aa8bd28342a793cd22bd2206233d7f1cf23c88030d2a309a99fefc0faae0a92c WatchSource:0}: Error finding container aa8bd28342a793cd22bd2206233d7f1cf23c88030d2a309a99fefc0faae0a92c: Status 404 returned error can't find the container with id aa8bd28342a793cd22bd2206233d7f1cf23c88030d2a309a99fefc0faae0a92c Apr 17 20:53:32.892546 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.892518 2576 patch_prober.go:28] interesting pod/image-registry-66c59b6986-j28rw container/registry namespace/openshift-image-registry: Readiness probe 
status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 20:53:32.892675 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.892578 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66c59b6986-j28rw" podUID="f793a52c-6af4-4836-bb9d-78c25e56a7ed" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 20:53:32.969821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.969744 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerStarted","Data":"c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf"} Apr 17 20:53:32.969821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.969786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerStarted","Data":"2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238"} Apr 17 20:53:32.969821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.969800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerStarted","Data":"970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6"} Apr 17 20:53:32.969821 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.969812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerStarted","Data":"fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0"} Apr 17 20:53:32.970100 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.969825 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerStarted","Data":"76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14"} Apr 17 20:53:32.970816 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:32.970788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s89br" event={"ID":"e0274df8-8f37-41d5-9ac8-89ed0c37fdd0","Type":"ContainerStarted","Data":"aa8bd28342a793cd22bd2206233d7f1cf23c88030d2a309a99fefc0faae0a92c"} Apr 17 20:53:33.976680 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:33.976656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerStarted","Data":"d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917"} Apr 17 20:53:34.004293 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:34.004245 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.051216012 podStartE2EDuration="6.004229448s" podCreationTimestamp="2026-04-17 20:53:28 +0000 UTC" firstStartedPulling="2026-04-17 20:53:29.213782179 +0000 UTC m=+171.220196192" lastFinishedPulling="2026-04-17 20:53:33.16679561 +0000 UTC m=+175.173209628" observedRunningTime="2026-04-17 20:53:34.002259026 +0000 UTC m=+176.008673094" watchObservedRunningTime="2026-04-17 20:53:34.004229448 +0000 UTC m=+176.010643479" Apr 17 20:53:34.981037 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:34.980988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s89br" event={"ID":"e0274df8-8f37-41d5-9ac8-89ed0c37fdd0","Type":"ContainerStarted","Data":"d6d491cc43f6687e9162db435a419a47c0bc1f9278354a25366e7a60a4873a9f"} Apr 17 20:53:34.981508 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:34.981152 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s89br" Apr 17 20:53:34.985673 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:34.985648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s89br" Apr 17 20:53:34.995548 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:34.995500 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s89br" podStartSLOduration=1.8522589790000001 podStartE2EDuration="2.995487287s" podCreationTimestamp="2026-04-17 20:53:32 +0000 UTC" firstStartedPulling="2026-04-17 20:53:32.787541316 +0000 UTC m=+174.793955326" lastFinishedPulling="2026-04-17 20:53:33.93076961 +0000 UTC m=+175.937183634" observedRunningTime="2026-04-17 20:53:34.995288453 +0000 UTC m=+177.001702494" watchObservedRunningTime="2026-04-17 20:53:34.995487287 +0000 UTC m=+177.001901320" Apr 17 20:53:41.090681 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:41.090648 2576 patch_prober.go:28] interesting pod/image-registry-66c59b6986-j28rw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 20:53:41.091063 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:41.090705 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66c59b6986-j28rw" podUID="f793a52c-6af4-4836-bb9d-78c25e56a7ed" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 20:53:42.891973 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:42.891941 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66c59b6986-j28rw" Apr 17 20:53:44.094447 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:44.094419 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-smxnr"] Apr 17 20:53:44.099953 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:44.099933 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-smxnr" Apr 17 20:53:44.102564 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:44.102538 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 20:53:44.102722 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:44.102606 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 20:53:44.102722 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:44.102606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-qhjdk\"" Apr 17 20:53:44.107415 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:44.107395 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-smxnr"] Apr 17 20:53:44.255920 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:44.255890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlqjh\" (UniqueName: \"kubernetes.io/projected/fad3a57b-d1f9-4920-971e-1e1bdf88df8f-kube-api-access-qlqjh\") pod \"downloads-6bcc868b7-smxnr\" (UID: \"fad3a57b-d1f9-4920-971e-1e1bdf88df8f\") " pod="openshift-console/downloads-6bcc868b7-smxnr" Apr 17 20:53:44.356583 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:44.356520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlqjh\" (UniqueName: 
\"kubernetes.io/projected/fad3a57b-d1f9-4920-971e-1e1bdf88df8f-kube-api-access-qlqjh\") pod \"downloads-6bcc868b7-smxnr\" (UID: \"fad3a57b-d1f9-4920-971e-1e1bdf88df8f\") " pod="openshift-console/downloads-6bcc868b7-smxnr" Apr 17 20:53:44.365125 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:44.365099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlqjh\" (UniqueName: \"kubernetes.io/projected/fad3a57b-d1f9-4920-971e-1e1bdf88df8f-kube-api-access-qlqjh\") pod \"downloads-6bcc868b7-smxnr\" (UID: \"fad3a57b-d1f9-4920-971e-1e1bdf88df8f\") " pod="openshift-console/downloads-6bcc868b7-smxnr" Apr 17 20:53:44.408893 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:44.408875 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-smxnr" Apr 17 20:53:44.517936 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:44.517911 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-smxnr"] Apr 17 20:53:44.519956 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:53:44.519931 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad3a57b_d1f9_4920_971e_1e1bdf88df8f.slice/crio-0cf67f0077a99bce0b4d25161967bc49925d679837ec6ca34dac3d773b6e38cd WatchSource:0}: Error finding container 0cf67f0077a99bce0b4d25161967bc49925d679837ec6ca34dac3d773b6e38cd: Status 404 returned error can't find the container with id 0cf67f0077a99bce0b4d25161967bc49925d679837ec6ca34dac3d773b6e38cd Apr 17 20:53:45.011965 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:45.011931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-smxnr" event={"ID":"fad3a57b-d1f9-4920-971e-1e1bdf88df8f","Type":"ContainerStarted","Data":"0cf67f0077a99bce0b4d25161967bc49925d679837ec6ca34dac3d773b6e38cd"} Apr 17 20:53:53.228184 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:53:53.228146 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cc8fc6744-qp5fg"] Apr 17 20:53:53.248102 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.248074 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc8fc6744-qp5fg"] Apr 17 20:53:53.248266 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.248226 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.251069 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.251019 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 20:53:53.251069 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.251059 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 20:53:53.251069 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.251070 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 20:53:53.251306 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.251259 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8b6pn\"" Apr 17 20:53:53.251362 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.251333 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 20:53:53.252265 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.252242 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 20:53:53.335404 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.335373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-serving-cert\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.335404 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.335406 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-service-ca\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.335611 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.335503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-oauth-serving-cert\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.335611 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.335543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-console-config\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.335611 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.335575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-oauth-config\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.335757 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:53:53.335649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m5rs\" (UniqueName: \"kubernetes.io/projected/0197dff2-26be-43bc-a692-2098548fc84f-kube-api-access-6m5rs\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.436539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.436492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-oauth-serving-cert\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.436539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.436543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-console-config\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.436758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.436582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-oauth-config\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.436758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.436608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6m5rs\" (UniqueName: \"kubernetes.io/projected/0197dff2-26be-43bc-a692-2098548fc84f-kube-api-access-6m5rs\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " 
pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.436758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.436729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-serving-cert\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.436907 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.436766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-service-ca\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.437338 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.437286 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-oauth-serving-cert\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.437471 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.437371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-console-config\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.437471 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.437438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-service-ca\") pod \"console-7cc8fc6744-qp5fg\" (UID: 
\"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.439567 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.439538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-serving-cert\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.439678 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.439601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-oauth-config\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.444840 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.444813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m5rs\" (UniqueName: \"kubernetes.io/projected/0197dff2-26be-43bc-a692-2098548fc84f-kube-api-access-6m5rs\") pod \"console-7cc8fc6744-qp5fg\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:53.559937 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:53.559853 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:53:59.479138 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:53:59.479107 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc8fc6744-qp5fg"] Apr 17 20:53:59.483160 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:53:59.483132 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0197dff2_26be_43bc_a692_2098548fc84f.slice/crio-e10f33c31759a465694c375e0777ea964f4308b98390f3c041d32dedabef75cc WatchSource:0}: Error finding container e10f33c31759a465694c375e0777ea964f4308b98390f3c041d32dedabef75cc: Status 404 returned error can't find the container with id e10f33c31759a465694c375e0777ea964f4308b98390f3c041d32dedabef75cc Apr 17 20:54:00.061738 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.061652 2576 generic.go:358] "Generic (PLEG): container finished" podID="f7403c3a-b10b-4584-9045-252ae239c2ef" containerID="833f3f04b73da65b315cd68f83f4bd3386240616d03aedb431ba7991735a903a" exitCode=0 Apr 17 20:54:00.061738 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.061707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w" event={"ID":"f7403c3a-b10b-4584-9045-252ae239c2ef","Type":"ContainerDied","Data":"833f3f04b73da65b315cd68f83f4bd3386240616d03aedb431ba7991735a903a"} Apr 17 20:54:00.062297 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.062253 2576 scope.go:117] "RemoveContainer" containerID="833f3f04b73da65b315cd68f83f4bd3386240616d03aedb431ba7991735a903a" Apr 17 20:54:00.066636 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.066422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-smxnr" event={"ID":"fad3a57b-d1f9-4920-971e-1e1bdf88df8f","Type":"ContainerStarted","Data":"45787617708f0ece0154f07628684c0f30cb1d5d4ffb022b87e894afaef44923"} Apr 17 
20:54:00.067176 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.067150 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-smxnr" Apr 17 20:54:00.070555 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.070487 2576 generic.go:358] "Generic (PLEG): container finished" podID="6ddc2491-03fd-4683-9633-ee15d1c98c2d" containerID="ad094b90d75bbf4510c77d757843648ad7b5c2e8f739c83802353d19723ced9a" exitCode=0 Apr 17 20:54:00.070555 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.070520 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wf7nr" event={"ID":"6ddc2491-03fd-4683-9633-ee15d1c98c2d","Type":"ContainerDied","Data":"ad094b90d75bbf4510c77d757843648ad7b5c2e8f739c83802353d19723ced9a"} Apr 17 20:54:00.070987 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.070929 2576 scope.go:117] "RemoveContainer" containerID="ad094b90d75bbf4510c77d757843648ad7b5c2e8f739c83802353d19723ced9a" Apr 17 20:54:00.071917 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.071897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc8fc6744-qp5fg" event={"ID":"0197dff2-26be-43bc-a692-2098548fc84f","Type":"ContainerStarted","Data":"e10f33c31759a465694c375e0777ea964f4308b98390f3c041d32dedabef75cc"} Apr 17 20:54:00.079788 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.079749 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-smxnr" Apr 17 20:54:00.116495 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.116045 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-smxnr" podStartSLOduration=1.18863446 podStartE2EDuration="16.116002813s" podCreationTimestamp="2026-04-17 20:53:44 +0000 UTC" firstStartedPulling="2026-04-17 20:53:44.521909831 +0000 UTC m=+186.528323844" 
lastFinishedPulling="2026-04-17 20:53:59.449278171 +0000 UTC m=+201.455692197" observedRunningTime="2026-04-17 20:54:00.115497295 +0000 UTC m=+202.121911329" watchObservedRunningTime="2026-04-17 20:54:00.116002813 +0000 UTC m=+202.122416849" Apr 17 20:54:00.811659 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.811402 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-759c6ffc5c-9bxqv"] Apr 17 20:54:00.815788 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.815721 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:00.825786 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.825763 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 20:54:00.825979 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.825953 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-759c6ffc5c-9bxqv"] Apr 17 20:54:00.906018 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.905978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-serving-cert\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:00.906209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.906071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-oauth-serving-cert\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:00.906209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.906101 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-service-ca\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:00.906209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.906146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-trusted-ca-bundle\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:00.906396 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.906209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22sz8\" (UniqueName: \"kubernetes.io/projected/c231d7b0-ed68-4d67-854a-1c84bcff5594-kube-api-access-22sz8\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:00.906396 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.906320 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-oauth-config\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:00.906396 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.906359 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-config\") pod \"console-759c6ffc5c-9bxqv\" (UID: 
\"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:00.922726 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:00.922687 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wsdk8_be99f611-577c-4ca5-8103-64206cf5c3bd/serve-healthcheck-canary/0.log" Apr 17 20:54:01.007722 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.007684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-config\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.007883 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.007801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-serving-cert\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.007883 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.007849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-oauth-serving-cert\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.007883 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.007877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-service-ca\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 
17 20:54:01.008066 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.007918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-trusted-ca-bundle\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.008066 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.007943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22sz8\" (UniqueName: \"kubernetes.io/projected/c231d7b0-ed68-4d67-854a-1c84bcff5594-kube-api-access-22sz8\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.008066 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.007986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-oauth-config\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.008877 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.008847 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-oauth-serving-cert\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.008877 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.008899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-config\") pod \"console-759c6ffc5c-9bxqv\" (UID: 
\"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.009378 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.009356 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-service-ca\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.009484 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.009448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-trusted-ca-bundle\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.011594 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.011568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-serving-cert\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.011759 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.011711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-oauth-config\") pod \"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.017711 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.017669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22sz8\" (UniqueName: \"kubernetes.io/projected/c231d7b0-ed68-4d67-854a-1c84bcff5594-kube-api-access-22sz8\") pod 
\"console-759c6ffc5c-9bxqv\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.078282 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.078251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wf7nr" event={"ID":"6ddc2491-03fd-4683-9633-ee15d1c98c2d","Type":"ContainerStarted","Data":"350d56942d683f658bf1087080a6cdc370de8912cc3c1f56be1219651b84ac67"} Apr 17 20:54:01.081470 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.081379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ntz2w" event={"ID":"f7403c3a-b10b-4584-9045-252ae239c2ef","Type":"ContainerStarted","Data":"f86c81fe0092768ec746d010b32a225888543f13f55bf8f22ab0855bf36019f2"} Apr 17 20:54:01.131215 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.131185 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:01.280062 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:01.279938 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-759c6ffc5c-9bxqv"] Apr 17 20:54:01.283728 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:54:01.283699 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc231d7b0_ed68_4d67_854a_1c84bcff5594.slice/crio-829141e8cddd07b2a8eb097b147188273041de4087b424ecfa5b00bde2eb8ae0 WatchSource:0}: Error finding container 829141e8cddd07b2a8eb097b147188273041de4087b424ecfa5b00bde2eb8ae0: Status 404 returned error can't find the container with id 829141e8cddd07b2a8eb097b147188273041de4087b424ecfa5b00bde2eb8ae0 Apr 17 20:54:02.087000 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:02.086954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-759c6ffc5c-9bxqv" 
event={"ID":"c231d7b0-ed68-4d67-854a-1c84bcff5594","Type":"ContainerStarted","Data":"829141e8cddd07b2a8eb097b147188273041de4087b424ecfa5b00bde2eb8ae0"} Apr 17 20:54:04.094508 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:04.094464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc8fc6744-qp5fg" event={"ID":"0197dff2-26be-43bc-a692-2098548fc84f","Type":"ContainerStarted","Data":"57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c"} Apr 17 20:54:04.095955 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:04.095926 2576 generic.go:358] "Generic (PLEG): container finished" podID="7e1e5242-44ff-4464-9b66-32c92d278373" containerID="b5eb56cf02370b7c1707efa47a8654c604ce2ded0f9af151640e781fc8aabf9f" exitCode=0 Apr 17 20:54:04.096088 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:04.096008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp" event={"ID":"7e1e5242-44ff-4464-9b66-32c92d278373","Type":"ContainerDied","Data":"b5eb56cf02370b7c1707efa47a8654c604ce2ded0f9af151640e781fc8aabf9f"} Apr 17 20:54:04.096350 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:04.096330 2576 scope.go:117] "RemoveContainer" containerID="b5eb56cf02370b7c1707efa47a8654c604ce2ded0f9af151640e781fc8aabf9f" Apr 17 20:54:04.097470 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:04.097449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-759c6ffc5c-9bxqv" event={"ID":"c231d7b0-ed68-4d67-854a-1c84bcff5594","Type":"ContainerStarted","Data":"94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff"} Apr 17 20:54:04.112727 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:04.112661 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cc8fc6744-qp5fg" podStartSLOduration=7.405354358 podStartE2EDuration="11.112646193s" 
podCreationTimestamp="2026-04-17 20:53:53 +0000 UTC" firstStartedPulling="2026-04-17 20:53:59.4847745 +0000 UTC m=+201.491188511" lastFinishedPulling="2026-04-17 20:54:03.192066322 +0000 UTC m=+205.198480346" observedRunningTime="2026-04-17 20:54:04.111167664 +0000 UTC m=+206.117581693" watchObservedRunningTime="2026-04-17 20:54:04.112646193 +0000 UTC m=+206.119060229" Apr 17 20:54:04.150143 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:04.150095 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-759c6ffc5c-9bxqv" podStartSLOduration=2.2432443539999998 podStartE2EDuration="4.150076795s" podCreationTimestamp="2026-04-17 20:54:00 +0000 UTC" firstStartedPulling="2026-04-17 20:54:01.286067742 +0000 UTC m=+203.292481752" lastFinishedPulling="2026-04-17 20:54:03.192900182 +0000 UTC m=+205.199314193" observedRunningTime="2026-04-17 20:54:04.148777527 +0000 UTC m=+206.155191559" watchObservedRunningTime="2026-04-17 20:54:04.150076795 +0000 UTC m=+206.156490828" Apr 17 20:54:05.103435 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:05.103393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gzcjp" event={"ID":"7e1e5242-44ff-4464-9b66-32c92d278373","Type":"ContainerStarted","Data":"32d39f2ed3cdae0144e830f2de567c4c2f96d9c4749803dade7b58f400bfdcf9"} Apr 17 20:54:11.131887 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:11.131855 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:11.131887 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:11.131892 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:11.136812 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:11.136788 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:12.127138 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:12.127112 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:54:12.167230 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:12.167199 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cc8fc6744-qp5fg"] Apr 17 20:54:13.560724 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:13.560694 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:54:37.186863 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.186801 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7cc8fc6744-qp5fg" podUID="0197dff2-26be-43bc-a692-2098548fc84f" containerName="console" containerID="cri-o://57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c" gracePeriod=15 Apr 17 20:54:37.494712 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.494690 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cc8fc6744-qp5fg_0197dff2-26be-43bc-a692-2098548fc84f/console/0.log" Apr 17 20:54:37.494817 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.494753 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:54:37.513570 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.513547 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-oauth-config\") pod \"0197dff2-26be-43bc-a692-2098548fc84f\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " Apr 17 20:54:37.513686 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.513587 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-serving-cert\") pod \"0197dff2-26be-43bc-a692-2098548fc84f\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " Apr 17 20:54:37.513686 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.513624 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m5rs\" (UniqueName: \"kubernetes.io/projected/0197dff2-26be-43bc-a692-2098548fc84f-kube-api-access-6m5rs\") pod \"0197dff2-26be-43bc-a692-2098548fc84f\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " Apr 17 20:54:37.513793 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.513695 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-service-ca\") pod \"0197dff2-26be-43bc-a692-2098548fc84f\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " Apr 17 20:54:37.513793 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.513738 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-console-config\") pod \"0197dff2-26be-43bc-a692-2098548fc84f\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " Apr 17 20:54:37.513793 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.513761 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-oauth-serving-cert\") pod \"0197dff2-26be-43bc-a692-2098548fc84f\" (UID: \"0197dff2-26be-43bc-a692-2098548fc84f\") " Apr 17 20:54:37.514255 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.514137 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-service-ca" (OuterVolumeSpecName: "service-ca") pod "0197dff2-26be-43bc-a692-2098548fc84f" (UID: "0197dff2-26be-43bc-a692-2098548fc84f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:54:37.514413 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.514365 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0197dff2-26be-43bc-a692-2098548fc84f" (UID: "0197dff2-26be-43bc-a692-2098548fc84f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:54:37.514743 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.514701 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-console-config" (OuterVolumeSpecName: "console-config") pod "0197dff2-26be-43bc-a692-2098548fc84f" (UID: "0197dff2-26be-43bc-a692-2098548fc84f"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:54:37.516230 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.516192 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0197dff2-26be-43bc-a692-2098548fc84f" (UID: "0197dff2-26be-43bc-a692-2098548fc84f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:54:37.516319 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.516301 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0197dff2-26be-43bc-a692-2098548fc84f-kube-api-access-6m5rs" (OuterVolumeSpecName: "kube-api-access-6m5rs") pod "0197dff2-26be-43bc-a692-2098548fc84f" (UID: "0197dff2-26be-43bc-a692-2098548fc84f"). InnerVolumeSpecName "kube-api-access-6m5rs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:54:37.516367 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.516340 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0197dff2-26be-43bc-a692-2098548fc84f" (UID: "0197dff2-26be-43bc-a692-2098548fc84f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:54:37.614649 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.614623 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6m5rs\" (UniqueName: \"kubernetes.io/projected/0197dff2-26be-43bc-a692-2098548fc84f-kube-api-access-6m5rs\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:37.614649 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.614646 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-service-ca\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:37.614774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.614655 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-console-config\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:37.614774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.614665 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0197dff2-26be-43bc-a692-2098548fc84f-oauth-serving-cert\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:37.614774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.614674 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-oauth-config\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:37.614774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:37.614682 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0197dff2-26be-43bc-a692-2098548fc84f-console-serving-cert\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:38.196582 ip-10-0-128-248 
kubenswrapper[2576]: I0417 20:54:38.196554 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cc8fc6744-qp5fg_0197dff2-26be-43bc-a692-2098548fc84f/console/0.log" Apr 17 20:54:38.196991 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:38.196593 2576 generic.go:358] "Generic (PLEG): container finished" podID="0197dff2-26be-43bc-a692-2098548fc84f" containerID="57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c" exitCode=2 Apr 17 20:54:38.196991 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:38.196620 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc8fc6744-qp5fg" event={"ID":"0197dff2-26be-43bc-a692-2098548fc84f","Type":"ContainerDied","Data":"57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c"} Apr 17 20:54:38.196991 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:38.196642 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc8fc6744-qp5fg" event={"ID":"0197dff2-26be-43bc-a692-2098548fc84f","Type":"ContainerDied","Data":"e10f33c31759a465694c375e0777ea964f4308b98390f3c041d32dedabef75cc"} Apr 17 20:54:38.196991 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:38.196656 2576 scope.go:117] "RemoveContainer" containerID="57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c" Apr 17 20:54:38.196991 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:38.196653 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cc8fc6744-qp5fg" Apr 17 20:54:38.204965 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:38.204950 2576 scope.go:117] "RemoveContainer" containerID="57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c" Apr 17 20:54:38.205222 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:54:38.205189 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c\": container with ID starting with 57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c not found: ID does not exist" containerID="57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c" Apr 17 20:54:38.205264 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:38.205228 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c"} err="failed to get container status \"57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c\": rpc error: code = NotFound desc = could not find container \"57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c\": container with ID starting with 57d829919eaa437cec8652dc5ec9fd0c235f56fa2087b2e0388e49318ca47a9c not found: ID does not exist" Apr 17 20:54:38.217171 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:38.217153 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cc8fc6744-qp5fg"] Apr 17 20:54:38.222725 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:38.222705 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7cc8fc6744-qp5fg"] Apr 17 20:54:38.491489 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:38.491410 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0197dff2-26be-43bc-a692-2098548fc84f" 
path="/var/lib/kubelet/pods/0197dff2-26be-43bc-a692-2098548fc84f/volumes" Apr 17 20:54:47.820257 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:47.820224 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:54:48.226433 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:48.226327 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="alertmanager" containerID="cri-o://76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14" gracePeriod=120 Apr 17 20:54:48.226433 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:48.226378 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy-metric" containerID="cri-o://c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf" gracePeriod=120 Apr 17 20:54:48.226433 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:48.226411 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy-web" containerID="cri-o://970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6" gracePeriod=120 Apr 17 20:54:48.226711 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:48.226466 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="config-reloader" containerID="cri-o://fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0" gracePeriod=120 Apr 17 20:54:48.226711 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:48.226492 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="prom-label-proxy" containerID="cri-o://d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917" gracePeriod=120 Apr 17 20:54:48.226711 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:48.226498 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy" containerID="cri-o://2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238" gracePeriod=120 Apr 17 20:54:49.231536 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.231504 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerID="d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917" exitCode=0 Apr 17 20:54:49.231536 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.231529 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerID="c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf" exitCode=0 Apr 17 20:54:49.231536 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.231536 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerID="2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238" exitCode=0 Apr 17 20:54:49.231536 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.231541 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerID="fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0" exitCode=0 Apr 17 20:54:49.231536 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.231547 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerID="76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14" exitCode=0 Apr 17 
20:54:49.232021 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.231578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerDied","Data":"d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917"} Apr 17 20:54:49.232021 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.231617 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerDied","Data":"c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf"} Apr 17 20:54:49.232021 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.231632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerDied","Data":"2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238"} Apr 17 20:54:49.232021 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.231646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerDied","Data":"fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0"} Apr 17 20:54:49.232021 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.231661 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerDied","Data":"76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14"} Apr 17 20:54:49.458522 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.458498 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:49.499486 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499417 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvkzx\" (UniqueName: \"kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-kube-api-access-tvkzx\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.499486 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499449 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-config-volume\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.499486 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499480 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-web\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.499736 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499507 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.499736 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499546 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-trusted-ca-bundle\") pod 
\"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.499736 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499612 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-web-config\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.499736 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499654 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-main-tls\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.499736 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499694 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-main-db\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.499736 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499729 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-cluster-tls-config\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.500043 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499770 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: 
\"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.500043 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499812 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-metrics-client-ca\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.500043 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499839 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-tls-assets\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.500043 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.499873 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-config-out\") pod \"ba720c9a-8559-424f-aabc-7fc45c75698f\" (UID: \"ba720c9a-8559-424f-aabc-7fc45c75698f\") " Apr 17 20:54:49.500248 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.500039 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:54:49.500248 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.500148 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-main-db\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.500586 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.500427 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:54:49.501063 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.501020 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:54:49.502501 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.502463 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:54:49.502922 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.502898 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-config-volume" (OuterVolumeSpecName: "config-volume") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:54:49.503734 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.503707 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:54:49.504067 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.504047 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-kube-api-access-tvkzx" (OuterVolumeSpecName: "kube-api-access-tvkzx") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "kube-api-access-tvkzx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:54:49.505677 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.505641 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:54:49.505677 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.505665 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:54:49.505806 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.505661 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-config-out" (OuterVolumeSpecName: "config-out") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:54:49.505991 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.505971 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:54:49.508572 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.508548 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:54:49.514887 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.514864 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-web-config" (OuterVolumeSpecName: "web-config") pod "ba720c9a-8559-424f-aabc-7fc45c75698f" (UID: "ba720c9a-8559-424f-aabc-7fc45c75698f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:54:49.600758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600734 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-config-volume\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.600844 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600758 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.600844 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600778 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.600844 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600788 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.600844 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600797 2576 reconciler_common.go:299] "Volume detached for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-web-config\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.600844 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600805 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-main-tls\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.600844 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600815 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-cluster-tls-config\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.600844 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600823 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ba720c9a-8559-424f-aabc-7fc45c75698f-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.600844 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600833 2576 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba720c9a-8559-424f-aabc-7fc45c75698f-metrics-client-ca\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.600844 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600840 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-tls-assets\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.601111 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600849 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/ba720c9a-8559-424f-aabc-7fc45c75698f-config-out\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:49.601111 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:49.600858 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tvkzx\" (UniqueName: \"kubernetes.io/projected/ba720c9a-8559-424f-aabc-7fc45c75698f-kube-api-access-tvkzx\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:54:50.237226 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.237192 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerID="970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6" exitCode=0 Apr 17 20:54:50.237598 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.237248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerDied","Data":"970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6"} Apr 17 20:54:50.237598 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.237275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba720c9a-8559-424f-aabc-7fc45c75698f","Type":"ContainerDied","Data":"d6e9a21ad2a71e6dc927c081060806518b0fe0fa6aab65eebd18fbc6a8c16caf"} Apr 17 20:54:50.237598 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.237290 2576 scope.go:117] "RemoveContainer" containerID="d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917" Apr 17 20:54:50.237598 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.237310 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.244917 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.244896 2576 scope.go:117] "RemoveContainer" containerID="c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf" Apr 17 20:54:50.251526 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.251508 2576 scope.go:117] "RemoveContainer" containerID="2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238" Apr 17 20:54:50.257415 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.257398 2576 scope.go:117] "RemoveContainer" containerID="970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6" Apr 17 20:54:50.260804 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.260784 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:54:50.264497 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.264478 2576 scope.go:117] "RemoveContainer" containerID="fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0" Apr 17 20:54:50.264585 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.264568 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:54:50.270513 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.270496 2576 scope.go:117] "RemoveContainer" containerID="76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14" Apr 17 20:54:50.276987 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.276971 2576 scope.go:117] "RemoveContainer" containerID="d2d171ec4d0a1ecfcb1b5765ae2ed5ec487b0953803f1e9da636cee410f6c728" Apr 17 20:54:50.282825 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.282810 2576 scope.go:117] "RemoveContainer" containerID="d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917" Apr 17 20:54:50.283089 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:54:50.283069 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917\": container with ID starting with d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917 not found: ID does not exist" containerID="d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917" Apr 17 20:54:50.283158 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.283099 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917"} err="failed to get container status \"d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917\": rpc error: code = NotFound desc = could not find container \"d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917\": container with ID starting with d4bd9bd282d20e69db6ec6d692047a45c9f18689ca13f4dbdf6ad465f25a7917 not found: ID does not exist" Apr 17 20:54:50.283158 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.283124 2576 scope.go:117] "RemoveContainer" containerID="c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf" Apr 17 20:54:50.283373 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:54:50.283356 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf\": container with ID starting with c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf not found: ID does not exist" containerID="c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf" Apr 17 20:54:50.283412 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.283381 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf"} err="failed to get container status \"c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf\": rpc error: code = NotFound desc 
= could not find container \"c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf\": container with ID starting with c5ff7d7513137725e43083393acb9b3bd728d63fdfa23da1a381dcc01eab79cf not found: ID does not exist" Apr 17 20:54:50.283412 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.283399 2576 scope.go:117] "RemoveContainer" containerID="2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238" Apr 17 20:54:50.283612 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:54:50.283596 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238\": container with ID starting with 2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238 not found: ID does not exist" containerID="2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238" Apr 17 20:54:50.283687 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.283619 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238"} err="failed to get container status \"2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238\": rpc error: code = NotFound desc = could not find container \"2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238\": container with ID starting with 2914f8f1cefde30258de862165dab33c08b5c3ba79882b188d285a98e947a238 not found: ID does not exist" Apr 17 20:54:50.283687 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.283638 2576 scope.go:117] "RemoveContainer" containerID="970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6" Apr 17 20:54:50.283852 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:54:50.283838 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6\": 
container with ID starting with 970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6 not found: ID does not exist" containerID="970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6" Apr 17 20:54:50.283885 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.283855 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6"} err="failed to get container status \"970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6\": rpc error: code = NotFound desc = could not find container \"970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6\": container with ID starting with 970a56aa74b5ad8db16aefae334e746649d7895de2327f34cf3ca946743bdad6 not found: ID does not exist" Apr 17 20:54:50.283885 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.283876 2576 scope.go:117] "RemoveContainer" containerID="fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0" Apr 17 20:54:50.284108 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:54:50.284090 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0\": container with ID starting with fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0 not found: ID does not exist" containerID="fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0" Apr 17 20:54:50.284181 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.284117 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0"} err="failed to get container status \"fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0\": rpc error: code = NotFound desc = could not find container \"fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0\": container with 
ID starting with fae5f9bd4cdd3f4674cc10a3c199f26b38ef35f2d356249f9f5fa1afa36b5ea0 not found: ID does not exist" Apr 17 20:54:50.284181 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.284135 2576 scope.go:117] "RemoveContainer" containerID="76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14" Apr 17 20:54:50.284367 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:54:50.284350 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14\": container with ID starting with 76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14 not found: ID does not exist" containerID="76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14" Apr 17 20:54:50.284412 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.284373 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14"} err="failed to get container status \"76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14\": rpc error: code = NotFound desc = could not find container \"76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14\": container with ID starting with 76f9de0395bbb686e33986f4fd7e326361d74143c8c67530a1a210de97eb2e14 not found: ID does not exist" Apr 17 20:54:50.284412 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.284386 2576 scope.go:117] "RemoveContainer" containerID="d2d171ec4d0a1ecfcb1b5765ae2ed5ec487b0953803f1e9da636cee410f6c728" Apr 17 20:54:50.284593 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:54:50.284580 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d171ec4d0a1ecfcb1b5765ae2ed5ec487b0953803f1e9da636cee410f6c728\": container with ID starting with d2d171ec4d0a1ecfcb1b5765ae2ed5ec487b0953803f1e9da636cee410f6c728 not found: ID does 
not exist" containerID="d2d171ec4d0a1ecfcb1b5765ae2ed5ec487b0953803f1e9da636cee410f6c728" Apr 17 20:54:50.284634 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.284596 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d171ec4d0a1ecfcb1b5765ae2ed5ec487b0953803f1e9da636cee410f6c728"} err="failed to get container status \"d2d171ec4d0a1ecfcb1b5765ae2ed5ec487b0953803f1e9da636cee410f6c728\": rpc error: code = NotFound desc = could not find container \"d2d171ec4d0a1ecfcb1b5765ae2ed5ec487b0953803f1e9da636cee410f6c728\": container with ID starting with d2d171ec4d0a1ecfcb1b5765ae2ed5ec487b0953803f1e9da636cee410f6c728 not found: ID does not exist" Apr 17 20:54:50.291674 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.291653 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:54:50.291936 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.291924 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy-web" Apr 17 20:54:50.291972 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.291938 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy-web" Apr 17 20:54:50.291972 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.291950 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="alertmanager" Apr 17 20:54:50.291972 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.291955 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="alertmanager" Apr 17 20:54:50.291972 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.291965 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" 
containerName="init-config-reloader" Apr 17 20:54:50.291972 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.291970 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="init-config-reloader" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.291978 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="prom-label-proxy" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.291983 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="prom-label-proxy" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.291990 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.291995 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292003 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0197dff2-26be-43bc-a692-2098548fc84f" containerName="console" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292007 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0197dff2-26be-43bc-a692-2098548fc84f" containerName="console" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292013 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="config-reloader" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292018 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="config-reloader" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292040 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy-metric" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292045 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy-metric" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292092 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy-metric" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292099 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="alertmanager" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292106 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="prom-label-proxy" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292112 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292119 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0197dff2-26be-43bc-a692-2098548fc84f" containerName="console" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292125 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="kube-rbac-proxy-web" Apr 17 20:54:50.292132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.292131 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" containerName="config-reloader" Apr 17 20:54:50.297042 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.297010 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.299614 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.299593 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 20:54:50.299745 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.299663 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 20:54:50.299745 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.299684 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 20:54:50.299874 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.299750 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 20:54:50.299874 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.299686 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 20:54:50.299874 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.299777 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 20:54:50.300047 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.300019 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 20:54:50.300113 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.300040 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-ml5lb\"" Apr 17 20:54:50.300113 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.300084 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 20:54:50.304802 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.304779 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 20:54:50.307540 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.307519 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:54:50.406504 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-config-volume\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.406504 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk2s5\" (UniqueName: \"kubernetes.io/projected/d64d740e-c62f-415b-9506-0ddd2f3a03c9-kube-api-access-tk2s5\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.406699 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh" Apr 17 20:54:50.406699 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d64d740e-c62f-415b-9506-0ddd2f3a03c9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.406699 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.406699 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d64d740e-c62f-415b-9506-0ddd2f3a03c9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.406901 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.406901 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.406901 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d64d740e-c62f-415b-9506-0ddd2f3a03c9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.406901 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406812 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.406901 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406838 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-web-config\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.406901 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.407178 ip-10-0-128-248 kubenswrapper[2576]: 
I0417 20:54:50.406902 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d64d740e-c62f-415b-9506-0ddd2f3a03c9-config-out\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.407178 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.406925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d64d740e-c62f-415b-9506-0ddd2f3a03c9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.408655 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.408640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f35870d4-a814-46cd-8764-30e79f3a7d79-metrics-certs\") pod \"network-metrics-daemon-z2lfh\" (UID: \"f35870d4-a814-46cd-8764-30e79f3a7d79\") " pod="openshift-multus/network-metrics-daemon-z2lfh" Apr 17 20:54:50.487205 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.487177 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f22jz\"" Apr 17 20:54:50.487987 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.487942 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba720c9a-8559-424f-aabc-7fc45c75698f" path="/var/lib/kubelet/pods/ba720c9a-8559-424f-aabc-7fc45c75698f/volumes" Apr 17 20:54:50.495327 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.495308 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z2lfh" Apr 17 20:54:50.508167 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508283 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d64d740e-c62f-415b-9506-0ddd2f3a03c9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508346 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508346 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508442 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d64d740e-c62f-415b-9506-0ddd2f3a03c9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508442 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508442 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-web-config\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508583 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508583 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d64d740e-c62f-415b-9506-0ddd2f3a03c9-config-out\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508583 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508518 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d64d740e-c62f-415b-9506-0ddd2f3a03c9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508583 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-config-volume\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508583 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk2s5\" (UniqueName: \"kubernetes.io/projected/d64d740e-c62f-415b-9506-0ddd2f3a03c9-kube-api-access-tk2s5\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508801 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d64d740e-c62f-415b-9506-0ddd2f3a03c9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.508854 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.508810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d64d740e-c62f-415b-9506-0ddd2f3a03c9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.510013 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:54:50.508931 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d64d740e-c62f-415b-9506-0ddd2f3a03c9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.510013 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.509727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d64d740e-c62f-415b-9506-0ddd2f3a03c9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.511291 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.511263 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d64d740e-c62f-415b-9506-0ddd2f3a03c9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.511914 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.511471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.511914 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.511491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
20:54:50.512275 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.512216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.512430 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.512358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d64d740e-c62f-415b-9506-0ddd2f3a03c9-config-out\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.513039 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.512805 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.513192 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.513178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-web-config\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.513260 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.513204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.514076 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.514058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d64d740e-c62f-415b-9506-0ddd2f3a03c9-config-volume\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.518848 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.518830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk2s5\" (UniqueName: \"kubernetes.io/projected/d64d740e-c62f-415b-9506-0ddd2f3a03c9-kube-api-access-tk2s5\") pod \"alertmanager-main-0\" (UID: \"d64d740e-c62f-415b-9506-0ddd2f3a03c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.606397 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.606371 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:54:50.612199 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.612159 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z2lfh"] Apr 17 20:54:50.615237 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:54:50.615211 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf35870d4_a814_46cd_8764_30e79f3a7d79.slice/crio-33c472fade374539d40c5b7e32c188444a1422550be185e691566eb05d155a5c WatchSource:0}: Error finding container 33c472fade374539d40c5b7e32c188444a1422550be185e691566eb05d155a5c: Status 404 returned error can't find the container with id 33c472fade374539d40c5b7e32c188444a1422550be185e691566eb05d155a5c Apr 17 20:54:50.725109 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:50.725014 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:54:50.727170 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:54:50.727138 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd64d740e_c62f_415b_9506_0ddd2f3a03c9.slice/crio-efa619febec2c01b3c11908f23ea69adbbdf1915fcf3602a3ae2254c6db2278e WatchSource:0}: Error finding container efa619febec2c01b3c11908f23ea69adbbdf1915fcf3602a3ae2254c6db2278e: Status 404 returned error can't find the container with id efa619febec2c01b3c11908f23ea69adbbdf1915fcf3602a3ae2254c6db2278e Apr 17 20:54:51.247338 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.247296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z2lfh" event={"ID":"f35870d4-a814-46cd-8764-30e79f3a7d79","Type":"ContainerStarted","Data":"33c472fade374539d40c5b7e32c188444a1422550be185e691566eb05d155a5c"} Apr 17 20:54:51.248786 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.248759 2576 
generic.go:358] "Generic (PLEG): container finished" podID="d64d740e-c62f-415b-9506-0ddd2f3a03c9" containerID="c15c04ffdcce0cc8a8a33dcb16e338f7a9fefe20a3cf5e8404e0ecfb283c4700" exitCode=0 Apr 17 20:54:51.248908 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.248794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d64d740e-c62f-415b-9506-0ddd2f3a03c9","Type":"ContainerDied","Data":"c15c04ffdcce0cc8a8a33dcb16e338f7a9fefe20a3cf5e8404e0ecfb283c4700"} Apr 17 20:54:51.248908 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.248812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d64d740e-c62f-415b-9506-0ddd2f3a03c9","Type":"ContainerStarted","Data":"efa619febec2c01b3c11908f23ea69adbbdf1915fcf3602a3ae2254c6db2278e"} Apr 17 20:54:51.854589 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.854568 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-69d7584d88-zt59r"] Apr 17 20:54:51.858528 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.858509 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:51.861096 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.861075 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 20:54:51.861346 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.861329 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 20:54:51.861477 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.861411 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 20:54:51.861751 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.861728 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 20:54:51.861941 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.861921 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-7fjst\"" Apr 17 20:54:51.862063 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.862020 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 20:54:51.866531 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.866449 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 20:54:51.870923 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.870903 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-69d7584d88-zt59r"] Apr 17 20:54:51.921983 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.921887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-federate-client-tls\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:51.921983 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.921943 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8467193a-45b0-480d-b09a-7583a0813a24-metrics-client-ca\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:51.921983 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.921966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8467193a-45b0-480d-b09a-7583a0813a24-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:51.922183 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.921994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:51.922183 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.922055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-secret-telemeter-client\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:51.922183 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.922084 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-telemeter-client-tls\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:51.922183 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.922141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8467193a-45b0-480d-b09a-7583a0813a24-serving-certs-ca-bundle\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:51.922183 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:51.922178 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z574z\" (UniqueName: \"kubernetes.io/projected/8467193a-45b0-480d-b09a-7583a0813a24-kube-api-access-z574z\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.022792 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.022753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-secret-telemeter-client\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: 
\"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.022892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.022799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-telemeter-client-tls\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.022892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.022843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8467193a-45b0-480d-b09a-7583a0813a24-serving-certs-ca-bundle\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.022892 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.022881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z574z\" (UniqueName: \"kubernetes.io/projected/8467193a-45b0-480d-b09a-7583a0813a24-kube-api-access-z574z\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.023021 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.022962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-federate-client-tls\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.023177 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.023148 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8467193a-45b0-480d-b09a-7583a0813a24-metrics-client-ca\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.023298 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.023199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8467193a-45b0-480d-b09a-7583a0813a24-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.023298 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.023239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.023621 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.023601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8467193a-45b0-480d-b09a-7583a0813a24-serving-certs-ca-bundle\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.023766 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.023743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8467193a-45b0-480d-b09a-7583a0813a24-metrics-client-ca\") pod 
\"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.024065 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.024043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8467193a-45b0-480d-b09a-7583a0813a24-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.025174 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.025155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-secret-telemeter-client\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.025355 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.025328 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-telemeter-client-tls\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.025355 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.025347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-federate-client-tls\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.025432 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.025390 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8467193a-45b0-480d-b09a-7583a0813a24-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.030430 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.030409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z574z\" (UniqueName: \"kubernetes.io/projected/8467193a-45b0-480d-b09a-7583a0813a24-kube-api-access-z574z\") pod \"telemeter-client-69d7584d88-zt59r\" (UID: \"8467193a-45b0-480d-b09a-7583a0813a24\") " pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.169520 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.169466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" Apr 17 20:54:52.255319 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.255283 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d64d740e-c62f-415b-9506-0ddd2f3a03c9","Type":"ContainerStarted","Data":"500beda3544426996010365376bdb358bebaa28dd887a113d599828e13de6221"} Apr 17 20:54:52.255743 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.255329 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d64d740e-c62f-415b-9506-0ddd2f3a03c9","Type":"ContainerStarted","Data":"d96b9a255a22d2b2cb8f6fd19385e521c6c7586132f4ed5b89a92f15df8a064d"} Apr 17 20:54:52.255743 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.255342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d64d740e-c62f-415b-9506-0ddd2f3a03c9","Type":"ContainerStarted","Data":"906d98e8c36ad2d2a31452c90d4ffe3768cc03ae49cde12d886f83e27b3eea27"} Apr 17 20:54:52.255743 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.255354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d64d740e-c62f-415b-9506-0ddd2f3a03c9","Type":"ContainerStarted","Data":"b924d9690349bc970a86cc5c4634f5e527211d1b1938881a392aea052d9eef6d"} Apr 17 20:54:52.255743 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.255367 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d64d740e-c62f-415b-9506-0ddd2f3a03c9","Type":"ContainerStarted","Data":"1122282188afef416b5cd184d8bc0836fc2890d25f9b94619651206470eddd6b"} Apr 17 20:54:52.255743 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.255378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d64d740e-c62f-415b-9506-0ddd2f3a03c9","Type":"ContainerStarted","Data":"2f1aaa32fcb752c2132753f07ced5d1adf5398aa7f6a669f156705d7d442fd6e"} Apr 17 20:54:52.257490 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.257434 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z2lfh" event={"ID":"f35870d4-a814-46cd-8764-30e79f3a7d79","Type":"ContainerStarted","Data":"0b9373079ace1a7bbcfdb38a333b4dc5377fd647b34d9f81c1679e55c52f8902"} Apr 17 20:54:52.257490 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.257469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z2lfh" event={"ID":"f35870d4-a814-46cd-8764-30e79f3a7d79","Type":"ContainerStarted","Data":"dc6e67f93a1392a10c11139ba13066103168f9a806486bb57791411cbd5bd820"} Apr 17 20:54:52.283905 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.283866 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.283853478 podStartE2EDuration="2.283853478s" podCreationTimestamp="2026-04-17 20:54:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:54:52.281620025 +0000 UTC m=+254.288034079" watchObservedRunningTime="2026-04-17 20:54:52.283853478 +0000 UTC m=+254.290267559" Apr 17 20:54:52.287528 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.287507 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-69d7584d88-zt59r"] Apr 17 20:54:52.289960 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:54:52.289920 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8467193a_45b0_480d_b09a_7583a0813a24.slice/crio-23412bcb96a2e91cd5c5a9e4b4fefbe75f1dd3070f9ce0d6badee4e92df79fbf WatchSource:0}: Error finding container 23412bcb96a2e91cd5c5a9e4b4fefbe75f1dd3070f9ce0d6badee4e92df79fbf: Status 404 returned error can't find the container with id 23412bcb96a2e91cd5c5a9e4b4fefbe75f1dd3070f9ce0d6badee4e92df79fbf Apr 17 20:54:52.299928 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:52.299889 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z2lfh" podStartSLOduration=253.199545567 podStartE2EDuration="4m14.299879334s" podCreationTimestamp="2026-04-17 20:50:38 +0000 UTC" firstStartedPulling="2026-04-17 20:54:50.617119665 +0000 UTC m=+252.623533686" lastFinishedPulling="2026-04-17 20:54:51.717453434 +0000 UTC m=+253.723867453" observedRunningTime="2026-04-17 20:54:52.299111075 +0000 UTC m=+254.305525109" watchObservedRunningTime="2026-04-17 20:54:52.299879334 +0000 UTC m=+254.306293365" Apr 17 20:54:53.260951 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:53.260913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" event={"ID":"8467193a-45b0-480d-b09a-7583a0813a24","Type":"ContainerStarted","Data":"23412bcb96a2e91cd5c5a9e4b4fefbe75f1dd3070f9ce0d6badee4e92df79fbf"} Apr 17 20:54:54.266209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:54.266131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" event={"ID":"8467193a-45b0-480d-b09a-7583a0813a24","Type":"ContainerStarted","Data":"10f7525ed746f61b40d04e2197310d13035eaac3ed084afa7d55715920bdf4b3"} Apr 17 20:54:54.266209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:54.266170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" event={"ID":"8467193a-45b0-480d-b09a-7583a0813a24","Type":"ContainerStarted","Data":"a0f818478132b008428c5f660d434297f679c66957dfa56e23064e0e3f253970"} Apr 17 20:54:54.266209 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:54.266181 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" event={"ID":"8467193a-45b0-480d-b09a-7583a0813a24","Type":"ContainerStarted","Data":"2450ab3e5f24ec278a704333b6324236d449a6eca1284d94645d6159508b91d0"} Apr 17 20:54:54.288102 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:54.288057 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-69d7584d88-zt59r" podStartSLOduration=1.68950049 podStartE2EDuration="3.288044157s" podCreationTimestamp="2026-04-17 20:54:51 +0000 UTC" firstStartedPulling="2026-04-17 20:54:52.291934555 +0000 UTC m=+254.298348569" lastFinishedPulling="2026-04-17 20:54:53.890478212 +0000 UTC m=+255.896892236" observedRunningTime="2026-04-17 20:54:54.286306013 +0000 UTC m=+256.292720042" watchObservedRunningTime="2026-04-17 20:54:54.288044157 +0000 UTC m=+256.294458185" Apr 17 20:54:54.954242 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:54:54.954203 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-94cdd66bc-n5tbc"] Apr 17 20:54:54.957478 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:54.957456 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:54.969007 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:54.968983 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-94cdd66bc-n5tbc"] Apr 17 20:54:55.049359 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.049331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-oauth-serving-cert\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.049501 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.049367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-service-ca\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.049501 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.049391 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvqkp\" (UniqueName: \"kubernetes.io/projected/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-kube-api-access-fvqkp\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.049501 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.049416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-oauth-config\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.049501 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.049441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-config\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.049501 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.049457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-trusted-ca-bundle\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.049675 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.049550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-serving-cert\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.150625 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.150592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-serving-cert\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.150747 ip-10-0-128-248 
kubenswrapper[2576]: I0417 20:54:55.150629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-oauth-serving-cert\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.150747 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.150664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-service-ca\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.150747 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.150697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvqkp\" (UniqueName: \"kubernetes.io/projected/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-kube-api-access-fvqkp\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.150747 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.150736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-oauth-config\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.150941 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.150770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-config\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " 
pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.150941 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.150799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-trusted-ca-bundle\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.151374 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.151344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-oauth-serving-cert\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.151465 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.151426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-service-ca\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.151618 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.151597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-config\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.151668 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.151601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-trusted-ca-bundle\") pod \"console-94cdd66bc-n5tbc\" (UID: 
\"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.153080 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.153053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-serving-cert\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.153158 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.153092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-oauth-config\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.158948 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.158929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvqkp\" (UniqueName: \"kubernetes.io/projected/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-kube-api-access-fvqkp\") pod \"console-94cdd66bc-n5tbc\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.266662 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.266604 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:54:55.383839 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:55.383815 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-94cdd66bc-n5tbc"] Apr 17 20:54:55.386224 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:54:55.386200 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c7245a9_ee41_4e6b_b7ee_4597dffa756f.slice/crio-c6e616464c0a7554fe8587c1c66a049265498e853cbfae52ebb07d224e656e6e WatchSource:0}: Error finding container c6e616464c0a7554fe8587c1c66a049265498e853cbfae52ebb07d224e656e6e: Status 404 returned error can't find the container with id c6e616464c0a7554fe8587c1c66a049265498e853cbfae52ebb07d224e656e6e Apr 17 20:54:56.276021 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:56.275983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-94cdd66bc-n5tbc" event={"ID":"9c7245a9-ee41-4e6b-b7ee-4597dffa756f","Type":"ContainerStarted","Data":"49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55"} Apr 17 20:54:56.276021 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:56.276017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-94cdd66bc-n5tbc" event={"ID":"9c7245a9-ee41-4e6b-b7ee-4597dffa756f","Type":"ContainerStarted","Data":"c6e616464c0a7554fe8587c1c66a049265498e853cbfae52ebb07d224e656e6e"} Apr 17 20:54:56.293591 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:54:56.293543 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-94cdd66bc-n5tbc" podStartSLOduration=2.2935291380000002 podStartE2EDuration="2.293529138s" podCreationTimestamp="2026-04-17 20:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:54:56.291888695 +0000 UTC 
m=+258.298302760" watchObservedRunningTime="2026-04-17 20:54:56.293529138 +0000 UTC m=+258.299943169" Apr 17 20:55:05.267406 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:05.267356 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:55:05.267406 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:05.267415 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:55:05.271806 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:05.271786 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:55:05.304680 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:05.304657 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:55:05.350289 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:05.350259 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-759c6ffc5c-9bxqv"] Apr 17 20:55:30.372220 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.372172 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-759c6ffc5c-9bxqv" podUID="c231d7b0-ed68-4d67-854a-1c84bcff5594" containerName="console" containerID="cri-o://94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff" gracePeriod=15 Apr 17 20:55:30.610213 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.610191 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-759c6ffc5c-9bxqv_c231d7b0-ed68-4d67-854a-1c84bcff5594/console/0.log" Apr 17 20:55:30.610316 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.610250 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:55:30.742837 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.742757 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-trusted-ca-bundle\") pod \"c231d7b0-ed68-4d67-854a-1c84bcff5594\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " Apr 17 20:55:30.743007 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.742863 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22sz8\" (UniqueName: \"kubernetes.io/projected/c231d7b0-ed68-4d67-854a-1c84bcff5594-kube-api-access-22sz8\") pod \"c231d7b0-ed68-4d67-854a-1c84bcff5594\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " Apr 17 20:55:30.743007 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.742894 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-service-ca\") pod \"c231d7b0-ed68-4d67-854a-1c84bcff5594\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " Apr 17 20:55:30.743007 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.742928 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-config\") pod \"c231d7b0-ed68-4d67-854a-1c84bcff5594\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " Apr 17 20:55:30.743007 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.742974 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-serving-cert\") pod \"c231d7b0-ed68-4d67-854a-1c84bcff5594\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " Apr 17 20:55:30.743007 ip-10-0-128-248 
kubenswrapper[2576]: I0417 20:55:30.742994 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-oauth-serving-cert\") pod \"c231d7b0-ed68-4d67-854a-1c84bcff5594\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " Apr 17 20:55:30.743272 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.743066 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-oauth-config\") pod \"c231d7b0-ed68-4d67-854a-1c84bcff5594\" (UID: \"c231d7b0-ed68-4d67-854a-1c84bcff5594\") " Apr 17 20:55:30.743324 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.743277 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c231d7b0-ed68-4d67-854a-1c84bcff5594" (UID: "c231d7b0-ed68-4d67-854a-1c84bcff5594"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:55:30.743441 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.743399 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c231d7b0-ed68-4d67-854a-1c84bcff5594" (UID: "c231d7b0-ed68-4d67-854a-1c84bcff5594"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:55:30.743441 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.743404 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-service-ca" (OuterVolumeSpecName: "service-ca") pod "c231d7b0-ed68-4d67-854a-1c84bcff5594" (UID: "c231d7b0-ed68-4d67-854a-1c84bcff5594"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:55:30.743612 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.743409 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-config" (OuterVolumeSpecName: "console-config") pod "c231d7b0-ed68-4d67-854a-1c84bcff5594" (UID: "c231d7b0-ed68-4d67-854a-1c84bcff5594"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:55:30.745073 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.745044 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c231d7b0-ed68-4d67-854a-1c84bcff5594" (UID: "c231d7b0-ed68-4d67-854a-1c84bcff5594"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:55:30.745073 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.745057 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c231d7b0-ed68-4d67-854a-1c84bcff5594-kube-api-access-22sz8" (OuterVolumeSpecName: "kube-api-access-22sz8") pod "c231d7b0-ed68-4d67-854a-1c84bcff5594" (UID: "c231d7b0-ed68-4d67-854a-1c84bcff5594"). InnerVolumeSpecName "kube-api-access-22sz8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:55:30.745311 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.745289 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c231d7b0-ed68-4d67-854a-1c84bcff5594" (UID: "c231d7b0-ed68-4d67-854a-1c84bcff5594"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:55:30.844196 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.844144 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-22sz8\" (UniqueName: \"kubernetes.io/projected/c231d7b0-ed68-4d67-854a-1c84bcff5594-kube-api-access-22sz8\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:55:30.844196 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.844190 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-service-ca\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:55:30.844196 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.844201 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-config\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:55:30.844196 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.844210 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-serving-cert\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:55:30.844467 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.844219 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-oauth-serving-cert\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:55:30.844467 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.844230 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c231d7b0-ed68-4d67-854a-1c84bcff5594-console-oauth-config\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:55:30.844467 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:30.844239 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c231d7b0-ed68-4d67-854a-1c84bcff5594-trusted-ca-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:55:31.379963 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:31.379936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-759c6ffc5c-9bxqv_c231d7b0-ed68-4d67-854a-1c84bcff5594/console/0.log" Apr 17 20:55:31.380463 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:31.379976 2576 generic.go:358] "Generic (PLEG): container finished" podID="c231d7b0-ed68-4d67-854a-1c84bcff5594" containerID="94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff" exitCode=2 Apr 17 20:55:31.380463 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:31.380069 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-759c6ffc5c-9bxqv" Apr 17 20:55:31.380463 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:31.380076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-759c6ffc5c-9bxqv" event={"ID":"c231d7b0-ed68-4d67-854a-1c84bcff5594","Type":"ContainerDied","Data":"94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff"} Apr 17 20:55:31.380463 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:31.380120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-759c6ffc5c-9bxqv" event={"ID":"c231d7b0-ed68-4d67-854a-1c84bcff5594","Type":"ContainerDied","Data":"829141e8cddd07b2a8eb097b147188273041de4087b424ecfa5b00bde2eb8ae0"} Apr 17 20:55:31.380463 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:31.380140 2576 scope.go:117] "RemoveContainer" containerID="94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff" Apr 17 20:55:31.390720 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:31.390694 2576 scope.go:117] "RemoveContainer" containerID="94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff" Apr 17 20:55:31.391129 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:55:31.391098 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff\": container with ID starting with 94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff not found: ID does not exist" containerID="94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff" Apr 17 20:55:31.391234 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:31.391140 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff"} err="failed to get container status \"94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff\": rpc error: code = 
NotFound desc = could not find container \"94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff\": container with ID starting with 94a8c08bede807bbdd4750062182c4173b5e3bdd1e3ee54b428cd7e3d48813ff not found: ID does not exist" Apr 17 20:55:31.400976 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:31.400950 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-759c6ffc5c-9bxqv"] Apr 17 20:55:31.402867 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:31.402844 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-759c6ffc5c-9bxqv"] Apr 17 20:55:32.487358 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:32.487328 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c231d7b0-ed68-4d67-854a-1c84bcff5594" path="/var/lib/kubelet/pods/c231d7b0-ed68-4d67-854a-1c84bcff5594/volumes" Apr 17 20:55:38.408489 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:38.408451 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log" Apr 17 20:55:38.412364 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:38.412340 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log" Apr 17 20:55:38.419010 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:55:38.418990 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 20:56:02.948083 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:02.948052 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-676df7fdc7-s829p"] Apr 17 20:56:02.950649 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:02.948397 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c231d7b0-ed68-4d67-854a-1c84bcff5594" containerName="console" Apr 17 20:56:02.950649 ip-10-0-128-248 kubenswrapper[2576]: 
I0417 20:56:02.948409 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c231d7b0-ed68-4d67-854a-1c84bcff5594" containerName="console" Apr 17 20:56:02.950649 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:02.948464 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c231d7b0-ed68-4d67-854a-1c84bcff5594" containerName="console" Apr 17 20:56:02.951492 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:02.951476 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:02.962004 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:02.961977 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-676df7fdc7-s829p"] Apr 17 20:56:03.092337 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.092295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-serving-cert\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.092517 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.092352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-oauth-serving-cert\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.092517 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.092375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9ng2\" (UniqueName: \"kubernetes.io/projected/ba3f396b-dc67-4c7a-8171-03acae476c79-kube-api-access-j9ng2\") pod \"console-676df7fdc7-s829p\" (UID: 
\"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.092517 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.092404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-oauth-config\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.092517 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.092419 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-console-config\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.092517 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.092438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-service-ca\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.092707 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.092512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-trusted-ca-bundle\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.193196 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.193151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-serving-cert\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.193196 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.193204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-oauth-serving-cert\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.193421 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.193229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9ng2\" (UniqueName: \"kubernetes.io/projected/ba3f396b-dc67-4c7a-8171-03acae476c79-kube-api-access-j9ng2\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.193421 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.193264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-oauth-config\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.193421 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.193286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-console-config\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.193421 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.193316 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-service-ca\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.193421 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.193375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-trusted-ca-bundle\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.194069 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.194041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-console-config\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.194189 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.194045 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-oauth-serving-cert\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.194189 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.194166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-service-ca\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.194283 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:56:03.194267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-trusted-ca-bundle\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.195648 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.195620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-oauth-config\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.195730 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.195707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-serving-cert\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.202588 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.202526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9ng2\" (UniqueName: \"kubernetes.io/projected/ba3f396b-dc67-4c7a-8171-03acae476c79-kube-api-access-j9ng2\") pod \"console-676df7fdc7-s829p\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.261217 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.261175 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:03.380636 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.380610 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-676df7fdc7-s829p"] Apr 17 20:56:03.383255 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:56:03.383229 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba3f396b_dc67_4c7a_8171_03acae476c79.slice/crio-75d240729ef30cc6ea70f10c646129e8f3303a60691519acbab751f5770094d1 WatchSource:0}: Error finding container 75d240729ef30cc6ea70f10c646129e8f3303a60691519acbab751f5770094d1: Status 404 returned error can't find the container with id 75d240729ef30cc6ea70f10c646129e8f3303a60691519acbab751f5770094d1 Apr 17 20:56:03.390527 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.390494 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:56:03.476103 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.475988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-676df7fdc7-s829p" event={"ID":"ba3f396b-dc67-4c7a-8171-03acae476c79","Type":"ContainerStarted","Data":"a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0"} Apr 17 20:56:03.476103 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.476049 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-676df7fdc7-s829p" event={"ID":"ba3f396b-dc67-4c7a-8171-03acae476c79","Type":"ContainerStarted","Data":"75d240729ef30cc6ea70f10c646129e8f3303a60691519acbab751f5770094d1"} Apr 17 20:56:03.492622 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:03.492511 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-676df7fdc7-s829p" podStartSLOduration=1.492495423 podStartE2EDuration="1.492495423s" podCreationTimestamp="2026-04-17 20:56:02 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:56:03.49175483 +0000 UTC m=+325.498168864" watchObservedRunningTime="2026-04-17 20:56:03.492495423 +0000 UTC m=+325.498909456" Apr 17 20:56:13.261348 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:13.261297 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:13.261348 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:13.261361 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:13.265792 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:13.265773 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:13.506510 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:13.506482 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-676df7fdc7-s829p" Apr 17 20:56:13.549422 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:13.549343 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-94cdd66bc-n5tbc"] Apr 17 20:56:38.574472 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.574402 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-94cdd66bc-n5tbc" podUID="9c7245a9-ee41-4e6b-b7ee-4597dffa756f" containerName="console" containerID="cri-o://49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55" gracePeriod=15 Apr 17 20:56:38.817558 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.817532 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-94cdd66bc-n5tbc_9c7245a9-ee41-4e6b-b7ee-4597dffa756f/console/0.log" Apr 17 20:56:38.817688 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.817598 2576 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:56:38.903013 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.902924 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-config\") pod \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " Apr 17 20:56:38.903013 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.902987 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-serving-cert\") pod \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " Apr 17 20:56:38.903013 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903007 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-oauth-config\") pod \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " Apr 17 20:56:38.903303 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903048 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-oauth-serving-cert\") pod \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " Apr 17 20:56:38.903303 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903065 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvqkp\" (UniqueName: \"kubernetes.io/projected/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-kube-api-access-fvqkp\") pod \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\" (UID: 
\"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " Apr 17 20:56:38.903303 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903092 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-trusted-ca-bundle\") pod \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " Apr 17 20:56:38.903303 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903128 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-service-ca\") pod \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\" (UID: \"9c7245a9-ee41-4e6b-b7ee-4597dffa756f\") " Apr 17 20:56:38.903567 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903373 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-config" (OuterVolumeSpecName: "console-config") pod "9c7245a9-ee41-4e6b-b7ee-4597dffa756f" (UID: "9c7245a9-ee41-4e6b-b7ee-4597dffa756f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:56:38.903567 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903544 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9c7245a9-ee41-4e6b-b7ee-4597dffa756f" (UID: "9c7245a9-ee41-4e6b-b7ee-4597dffa756f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:56:38.903643 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903586 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-service-ca" (OuterVolumeSpecName: "service-ca") pod "9c7245a9-ee41-4e6b-b7ee-4597dffa756f" (UID: "9c7245a9-ee41-4e6b-b7ee-4597dffa756f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:56:38.903643 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903590 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9c7245a9-ee41-4e6b-b7ee-4597dffa756f" (UID: "9c7245a9-ee41-4e6b-b7ee-4597dffa756f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:56:38.903754 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903741 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-oauth-serving-cert\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:56:38.903805 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903756 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-trusted-ca-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:56:38.903805 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903770 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-service-ca\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:56:38.903805 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.903784 
2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-config\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:56:38.905369 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.905334 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-kube-api-access-fvqkp" (OuterVolumeSpecName: "kube-api-access-fvqkp") pod "9c7245a9-ee41-4e6b-b7ee-4597dffa756f" (UID: "9c7245a9-ee41-4e6b-b7ee-4597dffa756f"). InnerVolumeSpecName "kube-api-access-fvqkp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:56:38.905627 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.905596 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9c7245a9-ee41-4e6b-b7ee-4597dffa756f" (UID: "9c7245a9-ee41-4e6b-b7ee-4597dffa756f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:56:38.905627 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:38.905609 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9c7245a9-ee41-4e6b-b7ee-4597dffa756f" (UID: "9c7245a9-ee41-4e6b-b7ee-4597dffa756f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:56:39.004361 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.004331 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-serving-cert\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:56:39.004361 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.004358 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-console-oauth-config\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:56:39.004361 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.004369 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fvqkp\" (UniqueName: \"kubernetes.io/projected/9c7245a9-ee41-4e6b-b7ee-4597dffa756f-kube-api-access-fvqkp\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:56:39.581351 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.581322 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-94cdd66bc-n5tbc_9c7245a9-ee41-4e6b-b7ee-4597dffa756f/console/0.log" Apr 17 20:56:39.581745 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.581363 2576 generic.go:358] "Generic (PLEG): container finished" podID="9c7245a9-ee41-4e6b-b7ee-4597dffa756f" containerID="49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55" exitCode=2 Apr 17 20:56:39.581745 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.581396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-94cdd66bc-n5tbc" event={"ID":"9c7245a9-ee41-4e6b-b7ee-4597dffa756f","Type":"ContainerDied","Data":"49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55"} Apr 17 20:56:39.581745 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.581436 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-console/console-94cdd66bc-n5tbc" event={"ID":"9c7245a9-ee41-4e6b-b7ee-4597dffa756f","Type":"ContainerDied","Data":"c6e616464c0a7554fe8587c1c66a049265498e853cbfae52ebb07d224e656e6e"} Apr 17 20:56:39.581745 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.581457 2576 scope.go:117] "RemoveContainer" containerID="49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55" Apr 17 20:56:39.581745 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.581459 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-94cdd66bc-n5tbc" Apr 17 20:56:39.589944 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.589926 2576 scope.go:117] "RemoveContainer" containerID="49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55" Apr 17 20:56:39.590230 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:56:39.590211 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55\": container with ID starting with 49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55 not found: ID does not exist" containerID="49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55" Apr 17 20:56:39.590299 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.590243 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55"} err="failed to get container status \"49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55\": rpc error: code = NotFound desc = could not find container \"49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55\": container with ID starting with 49b3e620e7c419b73de7a0a79b53a1453a9ed71675483ba00cfbf8d860362c55 not found: ID does not exist" Apr 17 20:56:39.601828 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.601798 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-94cdd66bc-n5tbc"] Apr 17 20:56:39.605786 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:39.605760 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-94cdd66bc-n5tbc"] Apr 17 20:56:40.486883 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:56:40.486852 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c7245a9-ee41-4e6b-b7ee-4597dffa756f" path="/var/lib/kubelet/pods/9c7245a9-ee41-4e6b-b7ee-4597dffa756f/volumes" Apr 17 20:57:07.633310 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.633271 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk"] Apr 17 20:57:07.633748 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.633602 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c7245a9-ee41-4e6b-b7ee-4597dffa756f" containerName="console" Apr 17 20:57:07.633748 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.633613 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7245a9-ee41-4e6b-b7ee-4597dffa756f" containerName="console" Apr 17 20:57:07.633748 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.633677 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c7245a9-ee41-4e6b-b7ee-4597dffa756f" containerName="console" Apr 17 20:57:07.636639 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.636621 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" Apr 17 20:57:07.639395 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.639375 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:57:07.640616 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.640580 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:57:07.640725 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.640637 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5vn7h\"" Apr 17 20:57:07.650192 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.648555 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk"] Apr 17 20:57:07.749431 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.749400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" Apr 17 20:57:07.749575 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.749441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" Apr 17 20:57:07.749575 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.749468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzfh4\" (UniqueName: \"kubernetes.io/projected/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-kube-api-access-hzfh4\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" Apr 17 20:57:07.849801 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.849766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" Apr 17 20:57:07.849944 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.849811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" Apr 17 20:57:07.849944 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.849839 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzfh4\" (UniqueName: \"kubernetes.io/projected/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-kube-api-access-hzfh4\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" Apr 17 20:57:07.850192 ip-10-0-128-248 
kubenswrapper[2576]: I0417 20:57:07.850173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" Apr 17 20:57:07.850241 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.850196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" Apr 17 20:57:07.858329 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.858306 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzfh4\" (UniqueName: \"kubernetes.io/projected/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-kube-api-access-hzfh4\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" Apr 17 20:57:07.953289 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:07.953224 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" Apr 17 20:57:08.069876 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:08.069794 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk"] Apr 17 20:57:08.072083 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:57:08.072054 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd09ab3a5_084e_4b23_a4db_0eaf956f1f11.slice/crio-ae86cb505ca9cb5b806ef2c1f84323121a309a8bd37b9cf0bf69e57aa2c580a3 WatchSource:0}: Error finding container ae86cb505ca9cb5b806ef2c1f84323121a309a8bd37b9cf0bf69e57aa2c580a3: Status 404 returned error can't find the container with id ae86cb505ca9cb5b806ef2c1f84323121a309a8bd37b9cf0bf69e57aa2c580a3 Apr 17 20:57:08.664433 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:08.664400 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" event={"ID":"d09ab3a5-084e-4b23-a4db-0eaf956f1f11","Type":"ContainerStarted","Data":"ae86cb505ca9cb5b806ef2c1f84323121a309a8bd37b9cf0bf69e57aa2c580a3"} Apr 17 20:57:13.681567 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:13.681532 2576 generic.go:358] "Generic (PLEG): container finished" podID="d09ab3a5-084e-4b23-a4db-0eaf956f1f11" containerID="93ed6d8cc3f309d22963844ad95748648cbc3742fcf5d4779075fcf4f5b6fafa" exitCode=0 Apr 17 20:57:13.681976 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:13.681585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" event={"ID":"d09ab3a5-084e-4b23-a4db-0eaf956f1f11","Type":"ContainerDied","Data":"93ed6d8cc3f309d22963844ad95748648cbc3742fcf5d4779075fcf4f5b6fafa"} Apr 17 20:57:15.688743 ip-10-0-128-248 kubenswrapper[2576]: 
I0417 20:57:15.688711 2576 generic.go:358] "Generic (PLEG): container finished" podID="d09ab3a5-084e-4b23-a4db-0eaf956f1f11" containerID="fb49b872912e9eab18c86968b3daaa6fc8aed072fea9ed2b2026c11065d006be" exitCode=0
Apr 17 20:57:15.689092 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:15.688770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" event={"ID":"d09ab3a5-084e-4b23-a4db-0eaf956f1f11","Type":"ContainerDied","Data":"fb49b872912e9eab18c86968b3daaa6fc8aed072fea9ed2b2026c11065d006be"}
Apr 17 20:57:22.713238 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:22.713204 2576 generic.go:358] "Generic (PLEG): container finished" podID="d09ab3a5-084e-4b23-a4db-0eaf956f1f11" containerID="9c6bc4005ba72002117e3b1a8ba63457bec5f67cd05bc42379da3e96e59b957e" exitCode=0
Apr 17 20:57:22.713620 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:22.713297 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" event={"ID":"d09ab3a5-084e-4b23-a4db-0eaf956f1f11","Type":"ContainerDied","Data":"9c6bc4005ba72002117e3b1a8ba63457bec5f67cd05bc42379da3e96e59b957e"}
Apr 17 20:57:23.839877 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:23.839856 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk"
Apr 17 20:57:23.905691 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:23.905667 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-bundle\") pod \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") "
Apr 17 20:57:23.905808 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:23.905763 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-util\") pod \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") "
Apr 17 20:57:23.905808 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:23.905786 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzfh4\" (UniqueName: \"kubernetes.io/projected/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-kube-api-access-hzfh4\") pod \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\" (UID: \"d09ab3a5-084e-4b23-a4db-0eaf956f1f11\") "
Apr 17 20:57:23.906301 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:23.906273 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-bundle" (OuterVolumeSpecName: "bundle") pod "d09ab3a5-084e-4b23-a4db-0eaf956f1f11" (UID: "d09ab3a5-084e-4b23-a4db-0eaf956f1f11"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:57:23.907912 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:23.907879 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-kube-api-access-hzfh4" (OuterVolumeSpecName: "kube-api-access-hzfh4") pod "d09ab3a5-084e-4b23-a4db-0eaf956f1f11" (UID: "d09ab3a5-084e-4b23-a4db-0eaf956f1f11"). InnerVolumeSpecName "kube-api-access-hzfh4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:57:23.909908 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:23.909884 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-util" (OuterVolumeSpecName: "util") pod "d09ab3a5-084e-4b23-a4db-0eaf956f1f11" (UID: "d09ab3a5-084e-4b23-a4db-0eaf956f1f11"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:57:24.006437 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:24.006382 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:57:24.006437 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:24.006403 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-util\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:57:24.006437 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:24.006412 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hzfh4\" (UniqueName: \"kubernetes.io/projected/d09ab3a5-084e-4b23-a4db-0eaf956f1f11-kube-api-access-hzfh4\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:57:24.720505 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:24.720464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk" event={"ID":"d09ab3a5-084e-4b23-a4db-0eaf956f1f11","Type":"ContainerDied","Data":"ae86cb505ca9cb5b806ef2c1f84323121a309a8bd37b9cf0bf69e57aa2c580a3"}
Apr 17 20:57:24.720505 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:24.720501 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae86cb505ca9cb5b806ef2c1f84323121a309a8bd37b9cf0bf69e57aa2c580a3"
Apr 17 20:57:24.720505 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:24.720502 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56m5qk"
Apr 17 20:57:29.914775 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.914740 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9"]
Apr 17 20:57:29.915215 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.915050 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d09ab3a5-084e-4b23-a4db-0eaf956f1f11" containerName="util"
Apr 17 20:57:29.915215 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.915061 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d09ab3a5-084e-4b23-a4db-0eaf956f1f11" containerName="util"
Apr 17 20:57:29.915215 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.915070 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d09ab3a5-084e-4b23-a4db-0eaf956f1f11" containerName="pull"
Apr 17 20:57:29.915215 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.915076 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d09ab3a5-084e-4b23-a4db-0eaf956f1f11" containerName="pull"
Apr 17 20:57:29.915215 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.915094 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d09ab3a5-084e-4b23-a4db-0eaf956f1f11" containerName="extract"
Apr 17 20:57:29.915215 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.915100 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d09ab3a5-084e-4b23-a4db-0eaf956f1f11" containerName="extract"
Apr 17 20:57:29.915215 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.915149 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d09ab3a5-084e-4b23-a4db-0eaf956f1f11" containerName="extract"
Apr 17 20:57:29.920129 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.920114 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9"
Apr 17 20:57:29.922715 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.922692 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 17 20:57:29.922829 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.922733 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-zrpdl\""
Apr 17 20:57:29.922829 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.922741 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:57:29.929416 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.929392 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9"]
Apr 17 20:57:29.954388 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.954364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b1d9be11-9788-4111-a252-342e430b46d3-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pl8q9\" (UID: \"b1d9be11-9788-4111-a252-342e430b46d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9"
Apr 17 20:57:29.954473 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:29.954400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsvp4\" (UniqueName: \"kubernetes.io/projected/b1d9be11-9788-4111-a252-342e430b46d3-kube-api-access-hsvp4\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pl8q9\" (UID: \"b1d9be11-9788-4111-a252-342e430b46d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9"
Apr 17 20:57:30.055264 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:30.055223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b1d9be11-9788-4111-a252-342e430b46d3-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pl8q9\" (UID: \"b1d9be11-9788-4111-a252-342e430b46d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9"
Apr 17 20:57:30.055264 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:30.055270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsvp4\" (UniqueName: \"kubernetes.io/projected/b1d9be11-9788-4111-a252-342e430b46d3-kube-api-access-hsvp4\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pl8q9\" (UID: \"b1d9be11-9788-4111-a252-342e430b46d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9"
Apr 17 20:57:30.055677 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:30.055657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b1d9be11-9788-4111-a252-342e430b46d3-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pl8q9\" (UID: \"b1d9be11-9788-4111-a252-342e430b46d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9"
Apr 17 20:57:30.064119 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:30.064094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsvp4\" (UniqueName: \"kubernetes.io/projected/b1d9be11-9788-4111-a252-342e430b46d3-kube-api-access-hsvp4\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pl8q9\" (UID: \"b1d9be11-9788-4111-a252-342e430b46d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9"
Apr 17 20:57:30.229991 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:30.229907 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9"
Apr 17 20:57:30.355190 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:30.355157 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9"]
Apr 17 20:57:30.359417 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:57:30.359387 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1d9be11_9788_4111_a252_342e430b46d3.slice/crio-df300259bfa4eef391a653c8417bd040719648f4f58abd098ec24caa889043c7 WatchSource:0}: Error finding container df300259bfa4eef391a653c8417bd040719648f4f58abd098ec24caa889043c7: Status 404 returned error can't find the container with id df300259bfa4eef391a653c8417bd040719648f4f58abd098ec24caa889043c7
Apr 17 20:57:30.745322 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:30.745287 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9" event={"ID":"b1d9be11-9788-4111-a252-342e430b46d3","Type":"ContainerStarted","Data":"df300259bfa4eef391a653c8417bd040719648f4f58abd098ec24caa889043c7"}
Apr 17 20:57:32.753521 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:32.753474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9" event={"ID":"b1d9be11-9788-4111-a252-342e430b46d3","Type":"ContainerStarted","Data":"ccccdfe07cc7bd51e462bd0d1c52cccd8b339ab0cc9204682249cc9d5b1702e0"}
Apr 17 20:57:32.776224 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:32.776174 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pl8q9" podStartSLOduration=1.96006319 podStartE2EDuration="3.776160876s" podCreationTimestamp="2026-04-17 20:57:29 +0000 UTC" firstStartedPulling="2026-04-17 20:57:30.361948212 +0000 UTC m=+412.368362223" lastFinishedPulling="2026-04-17 20:57:32.17804588 +0000 UTC m=+414.184459909" observedRunningTime="2026-04-17 20:57:32.774434893 +0000 UTC m=+414.780848938" watchObservedRunningTime="2026-04-17 20:57:32.776160876 +0000 UTC m=+414.782574913"
Apr 17 20:57:33.806313 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.806281 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"]
Apr 17 20:57:33.809944 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.809924 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:33.812578 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.812556 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 20:57:33.813657 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.813638 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5vn7h\""
Apr 17 20:57:33.813657 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.813649 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 20:57:33.817367 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.817344 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"]
Apr 17 20:57:33.887451 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.887412 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:33.887451 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.887452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:33.887682 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.887479 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wncx8\" (UniqueName: \"kubernetes.io/projected/22f514b0-9738-4b03-be54-a38449246fbd-kube-api-access-wncx8\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:33.988854 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.988810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:33.988854 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.988854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:33.989139 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.988911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wncx8\" (UniqueName: \"kubernetes.io/projected/22f514b0-9738-4b03-be54-a38449246fbd-kube-api-access-wncx8\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:33.989278 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.989257 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:33.989341 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.989312 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:34.000009 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:33.999983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wncx8\" (UniqueName: \"kubernetes.io/projected/22f514b0-9738-4b03-be54-a38449246fbd-kube-api-access-wncx8\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:34.120732 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:34.120691 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:34.250697 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:34.250556 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"]
Apr 17 20:57:34.252839 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:57:34.252801 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22f514b0_9738_4b03_be54_a38449246fbd.slice/crio-1c84bece7a52b21dbc1f2200d797e73622655638b2d5164d304cd980c5c690f2 WatchSource:0}: Error finding container 1c84bece7a52b21dbc1f2200d797e73622655638b2d5164d304cd980c5c690f2: Status 404 returned error can't find the container with id 1c84bece7a52b21dbc1f2200d797e73622655638b2d5164d304cd980c5c690f2
Apr 17 20:57:34.762039 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:34.761998 2576 generic.go:358] "Generic (PLEG): container finished" podID="22f514b0-9738-4b03-be54-a38449246fbd" containerID="ab31133dfa69f44e930d8195aebe76969400e21cf6bca15fcb1c24a2c86a3b1a" exitCode=0
Apr 17 20:57:34.762236 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:34.762082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj" event={"ID":"22f514b0-9738-4b03-be54-a38449246fbd","Type":"ContainerDied","Data":"ab31133dfa69f44e930d8195aebe76969400e21cf6bca15fcb1c24a2c86a3b1a"}
Apr 17 20:57:34.762236 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:34.762116 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj" event={"ID":"22f514b0-9738-4b03-be54-a38449246fbd","Type":"ContainerStarted","Data":"1c84bece7a52b21dbc1f2200d797e73622655638b2d5164d304cd980c5c690f2"}
Apr 17 20:57:36.536341 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.536251 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-lnx9f"]
Apr 17 20:57:36.547379 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.547351 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f"
Apr 17 20:57:36.548668 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.548638 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-lnx9f"]
Apr 17 20:57:36.550001 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.549972 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 20:57:36.551282 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.551259 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 20:57:36.551385 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.551262 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-rk7wr\""
Apr 17 20:57:36.613219 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.613179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvmrr\" (UniqueName: \"kubernetes.io/projected/dce5c551-63b4-432e-8e1c-dc1ed36ef751-kube-api-access-rvmrr\") pod \"cert-manager-webhook-597b96b99b-lnx9f\" (UID: \"dce5c551-63b4-432e-8e1c-dc1ed36ef751\") " pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f"
Apr 17 20:57:36.613390 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.613243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dce5c551-63b4-432e-8e1c-dc1ed36ef751-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-lnx9f\" (UID: \"dce5c551-63b4-432e-8e1c-dc1ed36ef751\") " pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f"
Apr 17 20:57:36.714540 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.714506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvmrr\" (UniqueName: \"kubernetes.io/projected/dce5c551-63b4-432e-8e1c-dc1ed36ef751-kube-api-access-rvmrr\") pod \"cert-manager-webhook-597b96b99b-lnx9f\" (UID: \"dce5c551-63b4-432e-8e1c-dc1ed36ef751\") " pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f"
Apr 17 20:57:36.714694 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.714550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dce5c551-63b4-432e-8e1c-dc1ed36ef751-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-lnx9f\" (UID: \"dce5c551-63b4-432e-8e1c-dc1ed36ef751\") " pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f"
Apr 17 20:57:36.723013 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.722987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dce5c551-63b4-432e-8e1c-dc1ed36ef751-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-lnx9f\" (UID: \"dce5c551-63b4-432e-8e1c-dc1ed36ef751\") " pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f"
Apr 17 20:57:36.723129 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.723082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvmrr\" (UniqueName: \"kubernetes.io/projected/dce5c551-63b4-432e-8e1c-dc1ed36ef751-kube-api-access-rvmrr\") pod \"cert-manager-webhook-597b96b99b-lnx9f\" (UID: \"dce5c551-63b4-432e-8e1c-dc1ed36ef751\") " pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f"
Apr 17 20:57:36.866519 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.866495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f"
Apr 17 20:57:36.982937 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:36.982913 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-lnx9f"]
Apr 17 20:57:36.985332 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:57:36.985306 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddce5c551_63b4_432e_8e1c_dc1ed36ef751.slice/crio-4b5a4eb8afb54f10a722e412495768c9110da604b8dbfb0e364590018080256d WatchSource:0}: Error finding container 4b5a4eb8afb54f10a722e412495768c9110da604b8dbfb0e364590018080256d: Status 404 returned error can't find the container with id 4b5a4eb8afb54f10a722e412495768c9110da604b8dbfb0e364590018080256d
Apr 17 20:57:37.773452 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:37.773419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f" event={"ID":"dce5c551-63b4-432e-8e1c-dc1ed36ef751","Type":"ContainerStarted","Data":"4b5a4eb8afb54f10a722e412495768c9110da604b8dbfb0e364590018080256d"}
Apr 17 20:57:37.778339 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:37.775854 2576 generic.go:358] "Generic (PLEG): container finished" podID="22f514b0-9738-4b03-be54-a38449246fbd" containerID="2073b294b63c9d072daf61004fcbc5899e903f4af15badad9199f83d237454c7" exitCode=0
Apr 17 20:57:37.778339 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:37.775931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj" event={"ID":"22f514b0-9738-4b03-be54-a38449246fbd","Type":"ContainerDied","Data":"2073b294b63c9d072daf61004fcbc5899e903f4af15badad9199f83d237454c7"}
Apr 17 20:57:38.780674 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:38.780591 2576 generic.go:358] "Generic (PLEG): container finished" podID="22f514b0-9738-4b03-be54-a38449246fbd" containerID="d2540380c424837f6c8879ae1df167bff062301c78921fa17c25e1b5f4598f66" exitCode=0
Apr 17 20:57:38.781018 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:38.780669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj" event={"ID":"22f514b0-9738-4b03-be54-a38449246fbd","Type":"ContainerDied","Data":"d2540380c424837f6c8879ae1df167bff062301c78921fa17c25e1b5f4598f66"}
Apr 17 20:57:39.674483 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.674453 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-wxm7c"]
Apr 17 20:57:39.680518 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.680498 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-wxm7c"
Apr 17 20:57:39.682948 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.682928 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-nsnjk\""
Apr 17 20:57:39.686708 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.686684 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-wxm7c"]
Apr 17 20:57:39.742207 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.742182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffws9\" (UniqueName: \"kubernetes.io/projected/c3902a9c-3f01-43fd-8c5b-7854b0b6171e-kube-api-access-ffws9\") pod \"cert-manager-cainjector-8966b78d4-wxm7c\" (UID: \"c3902a9c-3f01-43fd-8c5b-7854b0b6171e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wxm7c"
Apr 17 20:57:39.742315 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.742221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3902a9c-3f01-43fd-8c5b-7854b0b6171e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-wxm7c\" (UID: \"c3902a9c-3f01-43fd-8c5b-7854b0b6171e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wxm7c"
Apr 17 20:57:39.843567 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.843540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffws9\" (UniqueName: \"kubernetes.io/projected/c3902a9c-3f01-43fd-8c5b-7854b0b6171e-kube-api-access-ffws9\") pod \"cert-manager-cainjector-8966b78d4-wxm7c\" (UID: \"c3902a9c-3f01-43fd-8c5b-7854b0b6171e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wxm7c"
Apr 17 20:57:39.843969 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.843583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3902a9c-3f01-43fd-8c5b-7854b0b6171e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-wxm7c\" (UID: \"c3902a9c-3f01-43fd-8c5b-7854b0b6171e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wxm7c"
Apr 17 20:57:39.851590 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.851550 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3902a9c-3f01-43fd-8c5b-7854b0b6171e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-wxm7c\" (UID: \"c3902a9c-3f01-43fd-8c5b-7854b0b6171e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wxm7c"
Apr 17 20:57:39.851718 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.851657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffws9\" (UniqueName: \"kubernetes.io/projected/c3902a9c-3f01-43fd-8c5b-7854b0b6171e-kube-api-access-ffws9\") pod \"cert-manager-cainjector-8966b78d4-wxm7c\" (UID: \"c3902a9c-3f01-43fd-8c5b-7854b0b6171e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wxm7c"
Apr 17 20:57:39.898498 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.898472 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:39.990803 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:39.990738 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-wxm7c"
Apr 17 20:57:40.045690 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.045660 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wncx8\" (UniqueName: \"kubernetes.io/projected/22f514b0-9738-4b03-be54-a38449246fbd-kube-api-access-wncx8\") pod \"22f514b0-9738-4b03-be54-a38449246fbd\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") "
Apr 17 20:57:40.045834 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.045768 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-util\") pod \"22f514b0-9738-4b03-be54-a38449246fbd\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") "
Apr 17 20:57:40.045834 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.045813 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-bundle\") pod \"22f514b0-9738-4b03-be54-a38449246fbd\" (UID: \"22f514b0-9738-4b03-be54-a38449246fbd\") "
Apr 17 20:57:40.046242 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.046219 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-bundle" (OuterVolumeSpecName: "bundle") pod "22f514b0-9738-4b03-be54-a38449246fbd" (UID: "22f514b0-9738-4b03-be54-a38449246fbd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:57:40.047642 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.047612 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f514b0-9738-4b03-be54-a38449246fbd-kube-api-access-wncx8" (OuterVolumeSpecName: "kube-api-access-wncx8") pod "22f514b0-9738-4b03-be54-a38449246fbd" (UID: "22f514b0-9738-4b03-be54-a38449246fbd"). InnerVolumeSpecName "kube-api-access-wncx8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:57:40.050935 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.050910 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-util" (OuterVolumeSpecName: "util") pod "22f514b0-9738-4b03-be54-a38449246fbd" (UID: "22f514b0-9738-4b03-be54-a38449246fbd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:57:40.107286 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.107262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-wxm7c"]
Apr 17 20:57:40.109056 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:57:40.109017 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3902a9c_3f01_43fd_8c5b_7854b0b6171e.slice/crio-e9720f930c08b733125e4eabdc7bd1041e2be1b3c14632b0843869ff488d48b4 WatchSource:0}: Error finding container e9720f930c08b733125e4eabdc7bd1041e2be1b3c14632b0843869ff488d48b4: Status 404 returned error can't find the container with id e9720f930c08b733125e4eabdc7bd1041e2be1b3c14632b0843869ff488d48b4
Apr 17 20:57:40.146565 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.146542 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wncx8\" (UniqueName: \"kubernetes.io/projected/22f514b0-9738-4b03-be54-a38449246fbd-kube-api-access-wncx8\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:57:40.146641 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.146564 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-util\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:57:40.146641 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.146576 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22f514b0-9738-4b03-be54-a38449246fbd-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:57:40.789442 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.789359 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-wxm7c" event={"ID":"c3902a9c-3f01-43fd-8c5b-7854b0b6171e","Type":"ContainerStarted","Data":"e9720f930c08b733125e4eabdc7bd1041e2be1b3c14632b0843869ff488d48b4"}
Apr 17 20:57:40.791620 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.791580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj" event={"ID":"22f514b0-9738-4b03-be54-a38449246fbd","Type":"ContainerDied","Data":"1c84bece7a52b21dbc1f2200d797e73622655638b2d5164d304cd980c5c690f2"}
Apr 17 20:57:40.791620 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.791620 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c84bece7a52b21dbc1f2200d797e73622655638b2d5164d304cd980c5c690f2"
Apr 17 20:57:40.791809 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:40.791670 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4trlj"
Apr 17 20:57:43.804845 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:43.804809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-wxm7c" event={"ID":"c3902a9c-3f01-43fd-8c5b-7854b0b6171e","Type":"ContainerStarted","Data":"5d70f4769e28c0f9cc7a350096f5c311ed7c74deb6195977a2ef09a254f8ec20"}
Apr 17 20:57:43.806359 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:43.806329 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f" event={"ID":"dce5c551-63b4-432e-8e1c-dc1ed36ef751","Type":"ContainerStarted","Data":"38a7717f55d9d9a5249514dc5f466bab9186778b15b54d184612a17a46c57149"}
Apr 17 20:57:43.806490 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:43.806468 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f"
Apr 17 20:57:43.824587 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:43.824545 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-wxm7c" podStartSLOduration=1.682702685 podStartE2EDuration="4.824533543s" podCreationTimestamp="2026-04-17 20:57:39 +0000 UTC" firstStartedPulling="2026-04-17 20:57:40.110767682 +0000 UTC m=+422.117181696" lastFinishedPulling="2026-04-17 20:57:43.252598544 +0000 UTC m=+425.259012554" observedRunningTime="2026-04-17 20:57:43.823302957 +0000 UTC m=+425.829716990" watchObservedRunningTime="2026-04-17 20:57:43.824533543 +0000 UTC m=+425.830947575"
Apr 17 20:57:43.839272 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:43.839231 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f" podStartSLOduration=1.567091158 podStartE2EDuration="7.839218997s" podCreationTimestamp="2026-04-17
20:57:36 +0000 UTC" firstStartedPulling="2026-04-17 20:57:36.987066519 +0000 UTC m=+418.993480530" lastFinishedPulling="2026-04-17 20:57:43.259194358 +0000 UTC m=+425.265608369" observedRunningTime="2026-04-17 20:57:43.83784018 +0000 UTC m=+425.844254214" watchObservedRunningTime="2026-04-17 20:57:43.839218997 +0000 UTC m=+425.845633030" Apr 17 20:57:45.211568 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.211534 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g"] Apr 17 20:57:45.212014 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.211872 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22f514b0-9738-4b03-be54-a38449246fbd" containerName="pull" Apr 17 20:57:45.212014 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.211888 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f514b0-9738-4b03-be54-a38449246fbd" containerName="pull" Apr 17 20:57:45.212014 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.211914 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22f514b0-9738-4b03-be54-a38449246fbd" containerName="extract" Apr 17 20:57:45.212014 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.211922 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f514b0-9738-4b03-be54-a38449246fbd" containerName="extract" Apr 17 20:57:45.212014 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.211941 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22f514b0-9738-4b03-be54-a38449246fbd" containerName="util" Apr 17 20:57:45.212014 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.211950 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f514b0-9738-4b03-be54-a38449246fbd" containerName="util" Apr 17 20:57:45.212245 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.212060 2576 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="22f514b0-9738-4b03-be54-a38449246fbd" containerName="extract" Apr 17 20:57:45.215212 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.215197 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g" Apr 17 20:57:45.217797 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.217777 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 20:57:45.218913 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.218885 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-gpj45\"" Apr 17 20:57:45.218987 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.218885 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:57:45.223989 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.223965 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g"] Apr 17 20:57:45.292971 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.292937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51bedda0-02fb-443a-bd4f-e9016f008085-tmp\") pod \"openshift-lws-operator-bfc7f696d-wz56g\" (UID: \"51bedda0-02fb-443a-bd4f-e9016f008085\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g" Apr 17 20:57:45.293154 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.293006 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vr78\" (UniqueName: \"kubernetes.io/projected/51bedda0-02fb-443a-bd4f-e9016f008085-kube-api-access-7vr78\") pod \"openshift-lws-operator-bfc7f696d-wz56g\" (UID: 
\"51bedda0-02fb-443a-bd4f-e9016f008085\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g" Apr 17 20:57:45.394280 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.394238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51bedda0-02fb-443a-bd4f-e9016f008085-tmp\") pod \"openshift-lws-operator-bfc7f696d-wz56g\" (UID: \"51bedda0-02fb-443a-bd4f-e9016f008085\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g" Apr 17 20:57:45.394466 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.394348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vr78\" (UniqueName: \"kubernetes.io/projected/51bedda0-02fb-443a-bd4f-e9016f008085-kube-api-access-7vr78\") pod \"openshift-lws-operator-bfc7f696d-wz56g\" (UID: \"51bedda0-02fb-443a-bd4f-e9016f008085\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g" Apr 17 20:57:45.394666 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.394647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51bedda0-02fb-443a-bd4f-e9016f008085-tmp\") pod \"openshift-lws-operator-bfc7f696d-wz56g\" (UID: \"51bedda0-02fb-443a-bd4f-e9016f008085\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g" Apr 17 20:57:45.402284 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.402250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vr78\" (UniqueName: \"kubernetes.io/projected/51bedda0-02fb-443a-bd4f-e9016f008085-kube-api-access-7vr78\") pod \"openshift-lws-operator-bfc7f696d-wz56g\" (UID: \"51bedda0-02fb-443a-bd4f-e9016f008085\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g" Apr 17 20:57:45.525707 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.525621 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g" Apr 17 20:57:45.642155 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.642127 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g"] Apr 17 20:57:45.644178 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:57:45.644152 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51bedda0_02fb_443a_bd4f_e9016f008085.slice/crio-54da917eff612f2c0fe0cae3d0052b6369d2329d38ce91c3d8befec8755dd7f0 WatchSource:0}: Error finding container 54da917eff612f2c0fe0cae3d0052b6369d2329d38ce91c3d8befec8755dd7f0: Status 404 returned error can't find the container with id 54da917eff612f2c0fe0cae3d0052b6369d2329d38ce91c3d8befec8755dd7f0 Apr 17 20:57:45.814167 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:45.814139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g" event={"ID":"51bedda0-02fb-443a-bd4f-e9016f008085","Type":"ContainerStarted","Data":"54da917eff612f2c0fe0cae3d0052b6369d2329d38ce91c3d8befec8755dd7f0"} Apr 17 20:57:47.822728 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:47.822693 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g" event={"ID":"51bedda0-02fb-443a-bd4f-e9016f008085","Type":"ContainerStarted","Data":"275da0b250bb040cc3ae47a59347dda3e7b08d654d7dcd9352c746e079de782a"} Apr 17 20:57:49.811539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:49.811512 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-lnx9f" Apr 17 20:57:49.828453 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:49.828399 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wz56g" podStartSLOduration=3.017091361 podStartE2EDuration="4.828382449s" podCreationTimestamp="2026-04-17 20:57:45 +0000 UTC" firstStartedPulling="2026-04-17 20:57:45.646071085 +0000 UTC m=+427.652485098" lastFinishedPulling="2026-04-17 20:57:47.457362171 +0000 UTC m=+429.463776186" observedRunningTime="2026-04-17 20:57:47.837804107 +0000 UTC m=+429.844218144" watchObservedRunningTime="2026-04-17 20:57:49.828382449 +0000 UTC m=+431.834796484" Apr 17 20:57:50.607732 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.607702 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr"] Apr 17 20:57:50.611264 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.611248 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:50.613730 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.613700 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:57:50.614898 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.614867 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5vn7h\"" Apr 17 20:57:50.614898 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.614885 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:57:50.623526 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.619791 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr"] Apr 17 20:57:50.739137 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.739113 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:50.739263 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.739159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:50.739263 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.739254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsmpf\" (UniqueName: \"kubernetes.io/projected/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-kube-api-access-vsmpf\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:50.839948 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.839920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsmpf\" (UniqueName: \"kubernetes.io/projected/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-kube-api-access-vsmpf\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:50.840371 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.839963 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:50.840371 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.840001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:50.840371 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.840345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:50.840478 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.840393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:50.848199 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.848179 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsmpf\" (UniqueName: 
\"kubernetes.io/projected/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-kube-api-access-vsmpf\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:50.923684 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:50.923631 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:51.040917 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:51.040883 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr"] Apr 17 20:57:51.044349 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:57:51.044317 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26c2ce35_a429_4465_a8d8_f8e01acaf2b0.slice/crio-947928667a2473fcbdab02ae3c263d7e8a5001356eecc4afa10b1de1ca47ef63 WatchSource:0}: Error finding container 947928667a2473fcbdab02ae3c263d7e8a5001356eecc4afa10b1de1ca47ef63: Status 404 returned error can't find the container with id 947928667a2473fcbdab02ae3c263d7e8a5001356eecc4afa10b1de1ca47ef63 Apr 17 20:57:51.836987 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:51.836951 2576 generic.go:358] "Generic (PLEG): container finished" podID="26c2ce35-a429-4465-a8d8-f8e01acaf2b0" containerID="5558fb2af157024a10c452e09a03a9bc0a769c119285423790de23c3b4d63820" exitCode=0 Apr 17 20:57:51.837164 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:51.837038 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" event={"ID":"26c2ce35-a429-4465-a8d8-f8e01acaf2b0","Type":"ContainerDied","Data":"5558fb2af157024a10c452e09a03a9bc0a769c119285423790de23c3b4d63820"} Apr 17 
20:57:51.837164 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:51.837066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" event={"ID":"26c2ce35-a429-4465-a8d8-f8e01acaf2b0","Type":"ContainerStarted","Data":"947928667a2473fcbdab02ae3c263d7e8a5001356eecc4afa10b1de1ca47ef63"} Apr 17 20:57:52.841847 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:52.841810 2576 generic.go:358] "Generic (PLEG): container finished" podID="26c2ce35-a429-4465-a8d8-f8e01acaf2b0" containerID="10c0675e25971360c54d5e7d27c7fefeac0c2bf0dced9036713969e6d41a86c9" exitCode=0 Apr 17 20:57:52.842233 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:52.841886 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" event={"ID":"26c2ce35-a429-4465-a8d8-f8e01acaf2b0","Type":"ContainerDied","Data":"10c0675e25971360c54d5e7d27c7fefeac0c2bf0dced9036713969e6d41a86c9"} Apr 17 20:57:53.850554 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:53.850522 2576 generic.go:358] "Generic (PLEG): container finished" podID="26c2ce35-a429-4465-a8d8-f8e01acaf2b0" containerID="589539ee094600b82520f3f5613ba585dbceef9b077f541c7ecfbcaa8bce9037" exitCode=0 Apr 17 20:57:53.850903 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:53.850602 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" event={"ID":"26c2ce35-a429-4465-a8d8-f8e01acaf2b0","Type":"ContainerDied","Data":"589539ee094600b82520f3f5613ba585dbceef9b077f541c7ecfbcaa8bce9037"} Apr 17 20:57:54.980090 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:54.980066 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:57:55.073007 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.072975 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-util\") pod \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " Apr 17 20:57:55.073184 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.073046 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-bundle\") pod \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " Apr 17 20:57:55.073184 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.073095 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsmpf\" (UniqueName: \"kubernetes.io/projected/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-kube-api-access-vsmpf\") pod \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\" (UID: \"26c2ce35-a429-4465-a8d8-f8e01acaf2b0\") " Apr 17 20:57:55.073699 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.073672 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-bundle" (OuterVolumeSpecName: "bundle") pod "26c2ce35-a429-4465-a8d8-f8e01acaf2b0" (UID: "26c2ce35-a429-4465-a8d8-f8e01acaf2b0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:57:55.075286 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.075255 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-kube-api-access-vsmpf" (OuterVolumeSpecName: "kube-api-access-vsmpf") pod "26c2ce35-a429-4465-a8d8-f8e01acaf2b0" (UID: "26c2ce35-a429-4465-a8d8-f8e01acaf2b0"). InnerVolumeSpecName "kube-api-access-vsmpf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:57:55.078592 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.078562 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-util" (OuterVolumeSpecName: "util") pod "26c2ce35-a429-4465-a8d8-f8e01acaf2b0" (UID: "26c2ce35-a429-4465-a8d8-f8e01acaf2b0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:57:55.173843 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.173788 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vsmpf\" (UniqueName: \"kubernetes.io/projected/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-kube-api-access-vsmpf\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:57:55.173843 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.173813 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-util\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:57:55.173843 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.173825 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c2ce35-a429-4465-a8d8-f8e01acaf2b0-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:57:55.858910 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.858873 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" event={"ID":"26c2ce35-a429-4465-a8d8-f8e01acaf2b0","Type":"ContainerDied","Data":"947928667a2473fcbdab02ae3c263d7e8a5001356eecc4afa10b1de1ca47ef63"} Apr 17 20:57:55.858910 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.858908 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="947928667a2473fcbdab02ae3c263d7e8a5001356eecc4afa10b1de1ca47ef63" Apr 17 20:57:55.859140 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:57:55.858927 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wh2pr" Apr 17 20:58:09.733385 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.733344 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq"] Apr 17 20:58:09.733849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.733695 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26c2ce35-a429-4465-a8d8-f8e01acaf2b0" containerName="pull" Apr 17 20:58:09.733849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.733709 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c2ce35-a429-4465-a8d8-f8e01acaf2b0" containerName="pull" Apr 17 20:58:09.733849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.733733 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26c2ce35-a429-4465-a8d8-f8e01acaf2b0" containerName="util" Apr 17 20:58:09.733849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.733739 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c2ce35-a429-4465-a8d8-f8e01acaf2b0" containerName="util" Apr 17 20:58:09.733849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.733748 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="26c2ce35-a429-4465-a8d8-f8e01acaf2b0" containerName="extract" Apr 17 20:58:09.733849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.733754 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c2ce35-a429-4465-a8d8-f8e01acaf2b0" containerName="extract" Apr 17 20:58:09.733849 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.733818 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="26c2ce35-a429-4465-a8d8-f8e01acaf2b0" containerName="extract" Apr 17 20:58:09.741208 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.741187 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:09.743875 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.743852 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 20:58:09.743997 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.743931 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 20:58:09.743997 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.743871 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 20:58:09.744148 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.743940 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jpcvb\"" Apr 17 20:58:09.744148 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.743871 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 20:58:09.751469 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.751284 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq"] Apr 17 20:58:09.797879 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.797857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdzkz\" (UniqueName: \"kubernetes.io/projected/c32c12ad-6174-4995-b1ee-e43255b206ee-kube-api-access-qdzkz\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq\" (UID: \"c32c12ad-6174-4995-b1ee-e43255b206ee\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:09.798052 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.797904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c32c12ad-6174-4995-b1ee-e43255b206ee-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq\" (UID: \"c32c12ad-6174-4995-b1ee-e43255b206ee\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:09.798132 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.798074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c32c12ad-6174-4995-b1ee-e43255b206ee-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq\" (UID: \"c32c12ad-6174-4995-b1ee-e43255b206ee\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:09.899152 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.899123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c32c12ad-6174-4995-b1ee-e43255b206ee-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq\" (UID: \"c32c12ad-6174-4995-b1ee-e43255b206ee\") " 
pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:09.899253 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.899198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c32c12ad-6174-4995-b1ee-e43255b206ee-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq\" (UID: \"c32c12ad-6174-4995-b1ee-e43255b206ee\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:09.899253 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.899241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdzkz\" (UniqueName: \"kubernetes.io/projected/c32c12ad-6174-4995-b1ee-e43255b206ee-kube-api-access-qdzkz\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq\" (UID: \"c32c12ad-6174-4995-b1ee-e43255b206ee\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:09.901631 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.901611 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c32c12ad-6174-4995-b1ee-e43255b206ee-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq\" (UID: \"c32c12ad-6174-4995-b1ee-e43255b206ee\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:09.901732 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:09.901630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c32c12ad-6174-4995-b1ee-e43255b206ee-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq\" (UID: \"c32c12ad-6174-4995-b1ee-e43255b206ee\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:09.907749 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:58:09.907726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdzkz\" (UniqueName: \"kubernetes.io/projected/c32c12ad-6174-4995-b1ee-e43255b206ee-kube-api-access-qdzkz\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq\" (UID: \"c32c12ad-6174-4995-b1ee-e43255b206ee\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:10.031649 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.031581 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl"] Apr 17 20:58:10.035370 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.035356 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:10.038144 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.038122 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:58:10.038244 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.038126 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:58:10.038244 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.038198 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5vn7h\"" Apr 17 20:58:10.043185 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.043165 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl"] Apr 17 20:58:10.053105 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.053073 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:10.100944 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.100905 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:10.101166 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.101118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:10.101282 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.101261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7nd2\" (UniqueName: \"kubernetes.io/projected/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-kube-api-access-z7nd2\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:10.178162 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.178139 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq"] Apr 17 20:58:10.180960 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:58:10.180933 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32c12ad_6174_4995_b1ee_e43255b206ee.slice/crio-aadd08da6ed537d754521da3511635f46019ca6f4d438dbcf6b55d9f508e87c9 WatchSource:0}: Error finding container aadd08da6ed537d754521da3511635f46019ca6f4d438dbcf6b55d9f508e87c9: Status 404 returned error can't find the container with id aadd08da6ed537d754521da3511635f46019ca6f4d438dbcf6b55d9f508e87c9 Apr 17 20:58:10.202202 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.202171 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:10.202314 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.202222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:10.202371 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.202307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7nd2\" (UniqueName: \"kubernetes.io/projected/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-kube-api-access-z7nd2\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:10.202506 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.202487 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:10.202559 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.202541 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:10.213885 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.213859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7nd2\" (UniqueName: \"kubernetes.io/projected/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-kube-api-access-z7nd2\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:10.345825 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.345795 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:10.489474 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:58:10.489445 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f0a6051_e7b4_4dc7_b9c6_44d2c2e397df.slice/crio-985f9bdeaf42d0f4cab8391baf372ba75fd58d67cef487d70f68489ec6108a47 WatchSource:0}: Error finding container 985f9bdeaf42d0f4cab8391baf372ba75fd58d67cef487d70f68489ec6108a47: Status 404 returned error can't find the container with id 985f9bdeaf42d0f4cab8391baf372ba75fd58d67cef487d70f68489ec6108a47 Apr 17 20:58:10.489575 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.489559 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl"] Apr 17 20:58:10.919663 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.919582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" event={"ID":"c32c12ad-6174-4995-b1ee-e43255b206ee","Type":"ContainerStarted","Data":"aadd08da6ed537d754521da3511635f46019ca6f4d438dbcf6b55d9f508e87c9"} Apr 17 20:58:10.921927 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.921475 2576 generic.go:358] "Generic (PLEG): container finished" podID="1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" containerID="aa9b2e99eabc662f8f287e3aa65ddc1470d37ccca155c86f63125e855253bc51" exitCode=0 Apr 17 20:58:10.921927 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:10.921586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" event={"ID":"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df","Type":"ContainerDied","Data":"aa9b2e99eabc662f8f287e3aa65ddc1470d37ccca155c86f63125e855253bc51"} Apr 17 20:58:10.921927 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:58:10.921609 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" event={"ID":"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df","Type":"ContainerStarted","Data":"985f9bdeaf42d0f4cab8391baf372ba75fd58d67cef487d70f68489ec6108a47"} Apr 17 20:58:12.930343 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:12.930306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" event={"ID":"c32c12ad-6174-4995-b1ee-e43255b206ee","Type":"ContainerStarted","Data":"9d087f6c40d9f5312c43ab95a8acdc2a8ea6f8bb2ff421277c1e9ca7d9aef333"} Apr 17 20:58:12.930758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:12.930417 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:12.931969 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:12.931943 2576 generic.go:358] "Generic (PLEG): container finished" podID="1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" containerID="282fd3e81ef674606559e557eab97cc69d9039b31defce91c81d920640d10109" exitCode=0 Apr 17 20:58:12.932204 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:12.932020 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" event={"ID":"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df","Type":"ContainerDied","Data":"282fd3e81ef674606559e557eab97cc69d9039b31defce91c81d920640d10109"} Apr 17 20:58:12.960727 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:12.960684 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" podStartSLOduration=1.399444408 podStartE2EDuration="3.960671083s" podCreationTimestamp="2026-04-17 20:58:09 +0000 UTC" firstStartedPulling="2026-04-17 20:58:10.182630496 +0000 UTC m=+452.189044508" 
lastFinishedPulling="2026-04-17 20:58:12.743857169 +0000 UTC m=+454.750271183" observedRunningTime="2026-04-17 20:58:12.959962076 +0000 UTC m=+454.966376122" watchObservedRunningTime="2026-04-17 20:58:12.960671083 +0000 UTC m=+454.967085117" Apr 17 20:58:13.937087 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:13.937055 2576 generic.go:358] "Generic (PLEG): container finished" podID="1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" containerID="ec2d53a70f92fc2d7b21e9d161a33490deab8c8a85e1b26ab719d1486d0db9cf" exitCode=0 Apr 17 20:58:13.937440 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:13.937130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" event={"ID":"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df","Type":"ContainerDied","Data":"ec2d53a70f92fc2d7b21e9d161a33490deab8c8a85e1b26ab719d1486d0db9cf"} Apr 17 20:58:15.058464 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.058441 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:15.146213 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.146179 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-bundle\") pod \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " Apr 17 20:58:15.146382 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.146224 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-util\") pod \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " Apr 17 20:58:15.146382 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.146273 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7nd2\" (UniqueName: \"kubernetes.io/projected/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-kube-api-access-z7nd2\") pod \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\" (UID: \"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df\") " Apr 17 20:58:15.146833 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.146807 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-bundle" (OuterVolumeSpecName: "bundle") pod "1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" (UID: "1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:58:15.148464 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.148443 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-kube-api-access-z7nd2" (OuterVolumeSpecName: "kube-api-access-z7nd2") pod "1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" (UID: "1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df"). InnerVolumeSpecName "kube-api-access-z7nd2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:58:15.151734 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.151698 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-util" (OuterVolumeSpecName: "util") pod "1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" (UID: "1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:58:15.247475 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.247397 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:58:15.247475 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.247427 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-util\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:58:15.247475 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.247441 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z7nd2\" (UniqueName: \"kubernetes.io/projected/1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df-kube-api-access-z7nd2\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:58:15.945439 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.945403 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" event={"ID":"1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df","Type":"ContainerDied","Data":"985f9bdeaf42d0f4cab8391baf372ba75fd58d67cef487d70f68489ec6108a47"} Apr 17 20:58:15.945439 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.945436 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="985f9bdeaf42d0f4cab8391baf372ba75fd58d67cef487d70f68489ec6108a47" Apr 17 20:58:15.945643 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:15.945444 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n5fcl" Apr 17 20:58:23.939959 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:23.939928 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq" Apr 17 20:58:26.631342 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.631305 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5"] Apr 17 20:58:26.631710 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.631666 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" containerName="pull" Apr 17 20:58:26.631710 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.631678 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" containerName="pull" Apr 17 20:58:26.631710 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.631692 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" containerName="extract" Apr 17 20:58:26.631710 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.631697 2576 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" containerName="extract" Apr 17 20:58:26.631710 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.631708 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" containerName="util" Apr 17 20:58:26.631710 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.631713 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" containerName="util" Apr 17 20:58:26.631895 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.631765 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f0a6051-e7b4-4dc7-b9c6-44d2c2e397df" containerName="extract" Apr 17 20:58:26.635107 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.635088 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:26.639114 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.639095 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:58:26.640101 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.640087 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5vn7h\"" Apr 17 20:58:26.640154 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.640090 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:58:26.649154 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.649131 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5"] Apr 17 20:58:26.743786 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.743741 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mz5d\" (UniqueName: \"kubernetes.io/projected/9aef61af-c9f8-46d4-b82a-80e72a2b064c-kube-api-access-4mz5d\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:26.743957 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.743801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:26.743957 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.743903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:26.845222 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.845185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mz5d\" (UniqueName: \"kubernetes.io/projected/9aef61af-c9f8-46d4-b82a-80e72a2b064c-kube-api-access-4mz5d\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:26.845402 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.845239 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:26.845484 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.845427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:26.845641 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.845620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:26.845746 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.845730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:26.860476 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.860449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mz5d\" (UniqueName: 
\"kubernetes.io/projected/9aef61af-c9f8-46d4-b82a-80e72a2b064c-kube-api-access-4mz5d\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:26.943895 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:26.943816 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:27.071253 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.071222 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5"] Apr 17 20:58:27.072356 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:58:27.072326 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aef61af_c9f8_46d4_b82a_80e72a2b064c.slice/crio-612cf446810c5f1134f3a1ed8d7ec4966f62b5792bf588bfd242dc891c5a5fc0 WatchSource:0}: Error finding container 612cf446810c5f1134f3a1ed8d7ec4966f62b5792bf588bfd242dc891c5a5fc0: Status 404 returned error can't find the container with id 612cf446810c5f1134f3a1ed8d7ec4966f62b5792bf588bfd242dc891c5a5fc0 Apr 17 20:58:27.254213 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.254129 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj"] Apr 17 20:58:27.257705 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.257685 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" Apr 17 20:58:27.261328 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.261304 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 20:58:27.263145 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.262493 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 20:58:27.263145 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.262953 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 20:58:27.263825 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.263702 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 20:58:27.271533 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.271505 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj"] Apr 17 20:58:27.349748 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.349705 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43070034-fd46-413e-ac75-0b27ae25d08a-tmp\") pod \"kube-auth-proxy-666889b9b6-dwtcj\" (UID: \"43070034-fd46-413e-ac75-0b27ae25d08a\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" Apr 17 20:58:27.349950 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.349847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43070034-fd46-413e-ac75-0b27ae25d08a-tls-certs\") pod \"kube-auth-proxy-666889b9b6-dwtcj\" (UID: \"43070034-fd46-413e-ac75-0b27ae25d08a\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" Apr 17 20:58:27.349950 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.349939 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2474q\" (UniqueName: \"kubernetes.io/projected/43070034-fd46-413e-ac75-0b27ae25d08a-kube-api-access-2474q\") pod \"kube-auth-proxy-666889b9b6-dwtcj\" (UID: \"43070034-fd46-413e-ac75-0b27ae25d08a\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" Apr 17 20:58:27.450885 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.450846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43070034-fd46-413e-ac75-0b27ae25d08a-tls-certs\") pod \"kube-auth-proxy-666889b9b6-dwtcj\" (UID: \"43070034-fd46-413e-ac75-0b27ae25d08a\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" Apr 17 20:58:27.451103 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.450903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2474q\" (UniqueName: \"kubernetes.io/projected/43070034-fd46-413e-ac75-0b27ae25d08a-kube-api-access-2474q\") pod \"kube-auth-proxy-666889b9b6-dwtcj\" (UID: \"43070034-fd46-413e-ac75-0b27ae25d08a\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" Apr 17 20:58:27.451103 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.450992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43070034-fd46-413e-ac75-0b27ae25d08a-tmp\") pod \"kube-auth-proxy-666889b9b6-dwtcj\" (UID: \"43070034-fd46-413e-ac75-0b27ae25d08a\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" Apr 17 20:58:27.453169 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.453145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43070034-fd46-413e-ac75-0b27ae25d08a-tmp\") pod \"kube-auth-proxy-666889b9b6-dwtcj\" (UID: 
\"43070034-fd46-413e-ac75-0b27ae25d08a\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" Apr 17 20:58:27.453416 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.453396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43070034-fd46-413e-ac75-0b27ae25d08a-tls-certs\") pod \"kube-auth-proxy-666889b9b6-dwtcj\" (UID: \"43070034-fd46-413e-ac75-0b27ae25d08a\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" Apr 17 20:58:27.459130 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.459111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2474q\" (UniqueName: \"kubernetes.io/projected/43070034-fd46-413e-ac75-0b27ae25d08a-kube-api-access-2474q\") pod \"kube-auth-proxy-666889b9b6-dwtcj\" (UID: \"43070034-fd46-413e-ac75-0b27ae25d08a\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" Apr 17 20:58:27.572975 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.572949 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" Apr 17 20:58:27.692473 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.692446 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj"] Apr 17 20:58:27.694377 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:58:27.694350 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43070034_fd46_413e_ac75_0b27ae25d08a.slice/crio-ea3497e45978da38b6e158de10cd7e69fa4de5db91ae84824e4fa820f936f659 WatchSource:0}: Error finding container ea3497e45978da38b6e158de10cd7e69fa4de5db91ae84824e4fa820f936f659: Status 404 returned error can't find the container with id ea3497e45978da38b6e158de10cd7e69fa4de5db91ae84824e4fa820f936f659 Apr 17 20:58:27.991668 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.991632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" event={"ID":"43070034-fd46-413e-ac75-0b27ae25d08a","Type":"ContainerStarted","Data":"ea3497e45978da38b6e158de10cd7e69fa4de5db91ae84824e4fa820f936f659"} Apr 17 20:58:27.992767 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.992741 2576 generic.go:358] "Generic (PLEG): container finished" podID="9aef61af-c9f8-46d4-b82a-80e72a2b064c" containerID="ef502bb382d9f29666b4981af42f6a67e9adc6fc4ad3d86943a2a8ac9a6968c7" exitCode=0 Apr 17 20:58:27.992876 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.992824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" event={"ID":"9aef61af-c9f8-46d4-b82a-80e72a2b064c","Type":"ContainerDied","Data":"ef502bb382d9f29666b4981af42f6a67e9adc6fc4ad3d86943a2a8ac9a6968c7"} Apr 17 20:58:27.992876 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:27.992854 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" event={"ID":"9aef61af-c9f8-46d4-b82a-80e72a2b064c","Type":"ContainerStarted","Data":"612cf446810c5f1134f3a1ed8d7ec4966f62b5792bf588bfd242dc891c5a5fc0"} Apr 17 20:58:29.002774 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:29.002733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" event={"ID":"9aef61af-c9f8-46d4-b82a-80e72a2b064c","Type":"ContainerStarted","Data":"21f67bd6d5beca10c7f79407758cb5181887afcc43c184c041dd0f11abd83bab"} Apr 17 20:58:30.008232 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.008138 2576 generic.go:358] "Generic (PLEG): container finished" podID="9aef61af-c9f8-46d4-b82a-80e72a2b064c" containerID="21f67bd6d5beca10c7f79407758cb5181887afcc43c184c041dd0f11abd83bab" exitCode=0 Apr 17 20:58:30.008668 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.008287 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" event={"ID":"9aef61af-c9f8-46d4-b82a-80e72a2b064c","Type":"ContainerDied","Data":"21f67bd6d5beca10c7f79407758cb5181887afcc43c184c041dd0f11abd83bab"} Apr 17 20:58:30.058050 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.057699 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-cx8hg"] Apr 17 20:58:30.062905 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.062881 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:30.067316 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.067201 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-qw2np\"" Apr 17 20:58:30.068819 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.067703 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 17 20:58:30.070539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.070500 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-cx8hg"] Apr 17 20:58:30.177128 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.177094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9065c47d-482b-4274-9e3c-f9dfa8b506fb-cert\") pod \"odh-model-controller-858dbf95b8-cx8hg\" (UID: \"9065c47d-482b-4274-9e3c-f9dfa8b506fb\") " pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:30.177314 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.177192 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhv8d\" (UniqueName: \"kubernetes.io/projected/9065c47d-482b-4274-9e3c-f9dfa8b506fb-kube-api-access-zhv8d\") pod \"odh-model-controller-858dbf95b8-cx8hg\" (UID: \"9065c47d-482b-4274-9e3c-f9dfa8b506fb\") " pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:30.278645 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.278549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9065c47d-482b-4274-9e3c-f9dfa8b506fb-cert\") pod \"odh-model-controller-858dbf95b8-cx8hg\" (UID: \"9065c47d-482b-4274-9e3c-f9dfa8b506fb\") " pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" 
Apr 17 20:58:30.278813 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.278669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhv8d\" (UniqueName: \"kubernetes.io/projected/9065c47d-482b-4274-9e3c-f9dfa8b506fb-kube-api-access-zhv8d\") pod \"odh-model-controller-858dbf95b8-cx8hg\" (UID: \"9065c47d-482b-4274-9e3c-f9dfa8b506fb\") " pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:30.278813 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:58:30.278724 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 20:58:30.278813 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:58:30.278807 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9065c47d-482b-4274-9e3c-f9dfa8b506fb-cert podName:9065c47d-482b-4274-9e3c-f9dfa8b506fb nodeName:}" failed. No retries permitted until 2026-04-17 20:58:30.778783754 +0000 UTC m=+472.785197768 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9065c47d-482b-4274-9e3c-f9dfa8b506fb-cert") pod "odh-model-controller-858dbf95b8-cx8hg" (UID: "9065c47d-482b-4274-9e3c-f9dfa8b506fb") : secret "odh-model-controller-webhook-cert" not found Apr 17 20:58:30.287957 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.287927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhv8d\" (UniqueName: \"kubernetes.io/projected/9065c47d-482b-4274-9e3c-f9dfa8b506fb-kube-api-access-zhv8d\") pod \"odh-model-controller-858dbf95b8-cx8hg\" (UID: \"9065c47d-482b-4274-9e3c-f9dfa8b506fb\") " pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:30.784450 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:30.784412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9065c47d-482b-4274-9e3c-f9dfa8b506fb-cert\") pod \"odh-model-controller-858dbf95b8-cx8hg\" (UID: \"9065c47d-482b-4274-9e3c-f9dfa8b506fb\") " pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:30.784616 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:58:30.784545 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 20:58:30.784684 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:58:30.784616 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9065c47d-482b-4274-9e3c-f9dfa8b506fb-cert podName:9065c47d-482b-4274-9e3c-f9dfa8b506fb nodeName:}" failed. No retries permitted until 2026-04-17 20:58:31.784595928 +0000 UTC m=+473.791009942 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9065c47d-482b-4274-9e3c-f9dfa8b506fb-cert") pod "odh-model-controller-858dbf95b8-cx8hg" (UID: "9065c47d-482b-4274-9e3c-f9dfa8b506fb") : secret "odh-model-controller-webhook-cert" not found Apr 17 20:58:31.013583 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:31.013502 2576 generic.go:358] "Generic (PLEG): container finished" podID="9aef61af-c9f8-46d4-b82a-80e72a2b064c" containerID="ce5af5a4f72755ca6ba97d29a8962c876454d7e84ef027a421caf514b24d2e37" exitCode=0 Apr 17 20:58:31.013970 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:31.013586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" event={"ID":"9aef61af-c9f8-46d4-b82a-80e72a2b064c","Type":"ContainerDied","Data":"ce5af5a4f72755ca6ba97d29a8962c876454d7e84ef027a421caf514b24d2e37"} Apr 17 20:58:31.014956 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:31.014936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" event={"ID":"43070034-fd46-413e-ac75-0b27ae25d08a","Type":"ContainerStarted","Data":"559dc1ba0da86c39ae5342c2448c7dc32acca1625c14da44c9eae597fcd7fbce"} Apr 17 20:58:31.047373 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:31.047330 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-666889b9b6-dwtcj" podStartSLOduration=1.348730612 podStartE2EDuration="4.047318034s" podCreationTimestamp="2026-04-17 20:58:27 +0000 UTC" firstStartedPulling="2026-04-17 20:58:27.695988127 +0000 UTC m=+469.702402141" lastFinishedPulling="2026-04-17 20:58:30.394575553 +0000 UTC m=+472.400989563" observedRunningTime="2026-04-17 20:58:31.045400277 +0000 UTC m=+473.051814323" watchObservedRunningTime="2026-04-17 20:58:31.047318034 +0000 UTC m=+473.053732066" Apr 17 20:58:31.794541 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:58:31.794497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9065c47d-482b-4274-9e3c-f9dfa8b506fb-cert\") pod \"odh-model-controller-858dbf95b8-cx8hg\" (UID: \"9065c47d-482b-4274-9e3c-f9dfa8b506fb\") " pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:31.796788 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:31.796757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9065c47d-482b-4274-9e3c-f9dfa8b506fb-cert\") pod \"odh-model-controller-858dbf95b8-cx8hg\" (UID: \"9065c47d-482b-4274-9e3c-f9dfa8b506fb\") " pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:31.877758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:31.877723 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:32.136543 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:32.136521 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:32.198318 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:32.198165 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mz5d\" (UniqueName: \"kubernetes.io/projected/9aef61af-c9f8-46d4-b82a-80e72a2b064c-kube-api-access-4mz5d\") pod \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " Apr 17 20:58:32.198318 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:32.198254 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-bundle\") pod \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " Apr 17 20:58:32.198599 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:32.198335 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-util\") pod \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\" (UID: \"9aef61af-c9f8-46d4-b82a-80e72a2b064c\") " Apr 17 20:58:32.199535 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:32.199486 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-bundle" (OuterVolumeSpecName: "bundle") pod "9aef61af-c9f8-46d4-b82a-80e72a2b064c" (UID: "9aef61af-c9f8-46d4-b82a-80e72a2b064c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:58:32.200809 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:32.200774 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aef61af-c9f8-46d4-b82a-80e72a2b064c-kube-api-access-4mz5d" (OuterVolumeSpecName: "kube-api-access-4mz5d") pod "9aef61af-c9f8-46d4-b82a-80e72a2b064c" (UID: "9aef61af-c9f8-46d4-b82a-80e72a2b064c"). InnerVolumeSpecName "kube-api-access-4mz5d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:58:32.201070 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:32.201043 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-cx8hg"] Apr 17 20:58:32.203752 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:58:32.203730 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9065c47d_482b_4274_9e3c_f9dfa8b506fb.slice/crio-814c589e41e55fe4117767ac2532b8f70bfa654fa62c6d32a25a589254db98e4 WatchSource:0}: Error finding container 814c589e41e55fe4117767ac2532b8f70bfa654fa62c6d32a25a589254db98e4: Status 404 returned error can't find the container with id 814c589e41e55fe4117767ac2532b8f70bfa654fa62c6d32a25a589254db98e4 Apr 17 20:58:32.204956 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:32.204932 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-util" (OuterVolumeSpecName: "util") pod "9aef61af-c9f8-46d4-b82a-80e72a2b064c" (UID: "9aef61af-c9f8-46d4-b82a-80e72a2b064c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:58:32.299267 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:32.299244 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:58:32.299267 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:32.299267 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9aef61af-c9f8-46d4-b82a-80e72a2b064c-util\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:58:32.299423 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:32.299278 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mz5d\" (UniqueName: \"kubernetes.io/projected/9aef61af-c9f8-46d4-b82a-80e72a2b064c-kube-api-access-4mz5d\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:58:33.024888 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:33.024850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" event={"ID":"9065c47d-482b-4274-9e3c-f9dfa8b506fb","Type":"ContainerStarted","Data":"814c589e41e55fe4117767ac2532b8f70bfa654fa62c6d32a25a589254db98e4"} Apr 17 20:58:33.026802 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:33.026773 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" event={"ID":"9aef61af-c9f8-46d4-b82a-80e72a2b064c","Type":"ContainerDied","Data":"612cf446810c5f1134f3a1ed8d7ec4966f62b5792bf588bfd242dc891c5a5fc0"} Apr 17 20:58:33.026802 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:33.026804 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="612cf446810c5f1134f3a1ed8d7ec4966f62b5792bf588bfd242dc891c5a5fc0" Apr 17 20:58:33.026959 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:58:33.026827 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835vnrf5" Apr 17 20:58:35.037654 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:35.037566 2576 generic.go:358] "Generic (PLEG): container finished" podID="9065c47d-482b-4274-9e3c-f9dfa8b506fb" containerID="5623621fa5de5395f2d05ec83063ef3fca4351fa744a5dc6061d65992689f071" exitCode=1 Apr 17 20:58:35.038051 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:35.037653 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" event={"ID":"9065c47d-482b-4274-9e3c-f9dfa8b506fb","Type":"ContainerDied","Data":"5623621fa5de5395f2d05ec83063ef3fca4351fa744a5dc6061d65992689f071"} Apr 17 20:58:35.038051 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:35.037846 2576 scope.go:117] "RemoveContainer" containerID="5623621fa5de5395f2d05ec83063ef3fca4351fa744a5dc6061d65992689f071" Apr 17 20:58:36.044013 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.043947 2576 generic.go:358] "Generic (PLEG): container finished" podID="9065c47d-482b-4274-9e3c-f9dfa8b506fb" containerID="eaa02d6a15691683fbde85380abe95765e61cf581c79a7e045a28fab1329d06b" exitCode=1 Apr 17 20:58:36.044541 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.044054 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" event={"ID":"9065c47d-482b-4274-9e3c-f9dfa8b506fb","Type":"ContainerDied","Data":"eaa02d6a15691683fbde85380abe95765e61cf581c79a7e045a28fab1329d06b"} Apr 17 20:58:36.044541 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.044117 2576 scope.go:117] "RemoveContainer" containerID="5623621fa5de5395f2d05ec83063ef3fca4351fa744a5dc6061d65992689f071" Apr 17 20:58:36.044541 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.044337 2576 scope.go:117] "RemoveContainer" 
containerID="eaa02d6a15691683fbde85380abe95765e61cf581c79a7e045a28fab1329d06b" Apr 17 20:58:36.044737 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:58:36.044617 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-cx8hg_opendatahub(9065c47d-482b-4274-9e3c-f9dfa8b506fb)\"" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" podUID="9065c47d-482b-4274-9e3c-f9dfa8b506fb" Apr 17 20:58:36.198808 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.198774 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-kzt7p"] Apr 17 20:58:36.199189 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.199172 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9aef61af-c9f8-46d4-b82a-80e72a2b064c" containerName="util" Apr 17 20:58:36.199189 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.199189 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aef61af-c9f8-46d4-b82a-80e72a2b064c" containerName="util" Apr 17 20:58:36.199338 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.199202 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9aef61af-c9f8-46d4-b82a-80e72a2b064c" containerName="extract" Apr 17 20:58:36.199338 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.199209 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aef61af-c9f8-46d4-b82a-80e72a2b064c" containerName="extract" Apr 17 20:58:36.199338 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.199232 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9aef61af-c9f8-46d4-b82a-80e72a2b064c" containerName="pull" Apr 17 20:58:36.199338 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.199240 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aef61af-c9f8-46d4-b82a-80e72a2b064c" 
containerName="pull" Apr 17 20:58:36.199338 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.199314 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9aef61af-c9f8-46d4-b82a-80e72a2b064c" containerName="extract" Apr 17 20:58:36.203570 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.203550 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" Apr 17 20:58:36.206318 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.206293 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 17 20:58:36.206454 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.206297 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-nq4l8\"" Apr 17 20:58:36.212683 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.212663 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-kzt7p"] Apr 17 20:58:36.341811 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.341778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c20d1b53-d6f2-43e3-beaa-9ce52cc6af90-cert\") pod \"kserve-controller-manager-856948b99f-kzt7p\" (UID: \"c20d1b53-d6f2-43e3-beaa-9ce52cc6af90\") " pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" Apr 17 20:58:36.341811 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.341813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wnwh\" (UniqueName: \"kubernetes.io/projected/c20d1b53-d6f2-43e3-beaa-9ce52cc6af90-kube-api-access-2wnwh\") pod \"kserve-controller-manager-856948b99f-kzt7p\" (UID: \"c20d1b53-d6f2-43e3-beaa-9ce52cc6af90\") " pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" Apr 17 
20:58:36.442315 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.442277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c20d1b53-d6f2-43e3-beaa-9ce52cc6af90-cert\") pod \"kserve-controller-manager-856948b99f-kzt7p\" (UID: \"c20d1b53-d6f2-43e3-beaa-9ce52cc6af90\") " pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" Apr 17 20:58:36.442315 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.442319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wnwh\" (UniqueName: \"kubernetes.io/projected/c20d1b53-d6f2-43e3-beaa-9ce52cc6af90-kube-api-access-2wnwh\") pod \"kserve-controller-manager-856948b99f-kzt7p\" (UID: \"c20d1b53-d6f2-43e3-beaa-9ce52cc6af90\") " pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" Apr 17 20:58:36.444732 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.444707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c20d1b53-d6f2-43e3-beaa-9ce52cc6af90-cert\") pod \"kserve-controller-manager-856948b99f-kzt7p\" (UID: \"c20d1b53-d6f2-43e3-beaa-9ce52cc6af90\") " pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" Apr 17 20:58:36.451752 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.451729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wnwh\" (UniqueName: \"kubernetes.io/projected/c20d1b53-d6f2-43e3-beaa-9ce52cc6af90-kube-api-access-2wnwh\") pod \"kserve-controller-manager-856948b99f-kzt7p\" (UID: \"c20d1b53-d6f2-43e3-beaa-9ce52cc6af90\") " pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" Apr 17 20:58:36.520391 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.520366 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" Apr 17 20:58:36.638239 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:36.638212 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-kzt7p"] Apr 17 20:58:36.640147 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:58:36.640117 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc20d1b53_d6f2_43e3_beaa_9ce52cc6af90.slice/crio-8c4d5a65622edf1281a75b0675b1adc6ebe69c8bfd2accca3208783b78acbdf1 WatchSource:0}: Error finding container 8c4d5a65622edf1281a75b0675b1adc6ebe69c8bfd2accca3208783b78acbdf1: Status 404 returned error can't find the container with id 8c4d5a65622edf1281a75b0675b1adc6ebe69c8bfd2accca3208783b78acbdf1 Apr 17 20:58:37.048609 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:37.048532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" event={"ID":"c20d1b53-d6f2-43e3-beaa-9ce52cc6af90","Type":"ContainerStarted","Data":"8c4d5a65622edf1281a75b0675b1adc6ebe69c8bfd2accca3208783b78acbdf1"} Apr 17 20:58:37.050210 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:37.050190 2576 scope.go:117] "RemoveContainer" containerID="eaa02d6a15691683fbde85380abe95765e61cf581c79a7e045a28fab1329d06b" Apr 17 20:58:37.050379 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:58:37.050364 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-cx8hg_opendatahub(9065c47d-482b-4274-9e3c-f9dfa8b506fb)\"" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" podUID="9065c47d-482b-4274-9e3c-f9dfa8b506fb" Apr 17 20:58:39.060489 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:39.060284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" event={"ID":"c20d1b53-d6f2-43e3-beaa-9ce52cc6af90","Type":"ContainerStarted","Data":"a671116963346c18c94fa990827eae0cf7e097a261e8434cf83bf626c65d4f25"} Apr 17 20:58:39.060946 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:39.060521 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" Apr 17 20:58:39.078345 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:39.078291 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" podStartSLOduration=0.750559132 podStartE2EDuration="3.07827724s" podCreationTimestamp="2026-04-17 20:58:36 +0000 UTC" firstStartedPulling="2026-04-17 20:58:36.641466505 +0000 UTC m=+478.647880519" lastFinishedPulling="2026-04-17 20:58:38.969184616 +0000 UTC m=+480.975598627" observedRunningTime="2026-04-17 20:58:39.076531658 +0000 UTC m=+481.082945692" watchObservedRunningTime="2026-04-17 20:58:39.07827724 +0000 UTC m=+481.084691272" Apr 17 20:58:40.717298 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.717265 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t"] Apr 17 20:58:40.720711 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.720693 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:40.727197 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.727179 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:58:40.727406 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.727384 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5vn7h\"" Apr 17 20:58:40.728214 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.728198 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:58:40.733200 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.733180 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t"] Apr 17 20:58:40.883567 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.883533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz75m\" (UniqueName: \"kubernetes.io/projected/25272d74-18ad-4033-ac8c-cdf379ae078d-kube-api-access-fz75m\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:40.883567 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.883569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:40.883744 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.883587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:40.984502 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.984414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fz75m\" (UniqueName: \"kubernetes.io/projected/25272d74-18ad-4033-ac8c-cdf379ae078d-kube-api-access-fz75m\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:40.984502 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.984455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:40.984502 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.984475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:40.984830 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.984812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:40.984869 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:40.984841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:41.037406 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:41.037373 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz75m\" (UniqueName: \"kubernetes.io/projected/25272d74-18ad-4033-ac8c-cdf379ae078d-kube-api-access-fz75m\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:41.329371 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:41.329340 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:41.460214 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:41.460186 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t"] Apr 17 20:58:41.461677 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:58:41.461652 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25272d74_18ad_4033_ac8c_cdf379ae078d.slice/crio-ea73359a4e32fe2e801185be8e4bdf01ff1aaa68218cf95d392685175109998e WatchSource:0}: Error finding container ea73359a4e32fe2e801185be8e4bdf01ff1aaa68218cf95d392685175109998e: Status 404 returned error can't find the container with id ea73359a4e32fe2e801185be8e4bdf01ff1aaa68218cf95d392685175109998e Apr 17 20:58:41.878341 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:41.878303 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:41.878714 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:41.878697 2576 scope.go:117] "RemoveContainer" containerID="eaa02d6a15691683fbde85380abe95765e61cf581c79a7e045a28fab1329d06b" Apr 17 20:58:41.878900 ip-10-0-128-248 kubenswrapper[2576]: E0417 20:58:41.878883 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-cx8hg_opendatahub(9065c47d-482b-4274-9e3c-f9dfa8b506fb)\"" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" podUID="9065c47d-482b-4274-9e3c-f9dfa8b506fb" Apr 17 20:58:42.075855 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:42.075814 2576 generic.go:358] "Generic (PLEG): container finished" podID="25272d74-18ad-4033-ac8c-cdf379ae078d" 
containerID="e20aa8912e9c223a5a6fcfa1533e08672680f697cb20831b504a5b298238f997" exitCode=0 Apr 17 20:58:42.076053 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:42.075957 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" event={"ID":"25272d74-18ad-4033-ac8c-cdf379ae078d","Type":"ContainerDied","Data":"e20aa8912e9c223a5a6fcfa1533e08672680f697cb20831b504a5b298238f997"} Apr 17 20:58:42.076261 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:42.076234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" event={"ID":"25272d74-18ad-4033-ac8c-cdf379ae078d","Type":"ContainerStarted","Data":"ea73359a4e32fe2e801185be8e4bdf01ff1aaa68218cf95d392685175109998e"} Apr 17 20:58:42.822907 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:42.822870 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg"] Apr 17 20:58:42.826373 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:42.826354 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" Apr 17 20:58:42.829847 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:42.829816 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-pmmpd\"" Apr 17 20:58:42.830271 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:42.830252 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 20:58:42.830642 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:42.830628 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 20:58:42.839487 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:42.839463 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg"] Apr 17 20:58:43.002750 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:43.002714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9bhv\" (UniqueName: \"kubernetes.io/projected/fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1-kube-api-access-v9bhv\") pod \"servicemesh-operator3-55f49c5f94-bx2cg\" (UID: \"fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" Apr 17 20:58:43.003166 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:43.002826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1-operator-config\") pod \"servicemesh-operator3-55f49c5f94-bx2cg\" (UID: \"fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" Apr 17 20:58:43.103938 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:43.103848 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v9bhv\" (UniqueName: \"kubernetes.io/projected/fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1-kube-api-access-v9bhv\") pod \"servicemesh-operator3-55f49c5f94-bx2cg\" (UID: \"fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" Apr 17 20:58:43.103938 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:43.103933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1-operator-config\") pod \"servicemesh-operator3-55f49c5f94-bx2cg\" (UID: \"fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" Apr 17 20:58:43.106414 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:43.106387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1-operator-config\") pod \"servicemesh-operator3-55f49c5f94-bx2cg\" (UID: \"fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" Apr 17 20:58:43.115823 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:43.115799 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9bhv\" (UniqueName: \"kubernetes.io/projected/fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1-kube-api-access-v9bhv\") pod \"servicemesh-operator3-55f49c5f94-bx2cg\" (UID: \"fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" Apr 17 20:58:43.136032 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:43.136007 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" Apr 17 20:58:43.481679 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:43.481647 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg"] Apr 17 20:58:43.482082 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:58:43.482055 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe3ffa41_e883_49fc_a6e8_e8d0ccbd1ed1.slice/crio-a5bb60d1c571cf205c315569d7f35fc558c4d662cee4b5399d5dba50f11b51f5 WatchSource:0}: Error finding container a5bb60d1c571cf205c315569d7f35fc558c4d662cee4b5399d5dba50f11b51f5: Status 404 returned error can't find the container with id a5bb60d1c571cf205c315569d7f35fc558c4d662cee4b5399d5dba50f11b51f5 Apr 17 20:58:44.084546 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:44.084511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" event={"ID":"fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1","Type":"ContainerStarted","Data":"a5bb60d1c571cf205c315569d7f35fc558c4d662cee4b5399d5dba50f11b51f5"} Apr 17 20:58:48.100532 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:48.100501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" event={"ID":"fe3ffa41-e883-49fc-a6e8-e8d0ccbd1ed1","Type":"ContainerStarted","Data":"c1d55d306631731043bd9494ccca1025ecb99ae6578f439b0e87b7a24213495e"} Apr 17 20:58:48.100937 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:48.100563 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" Apr 17 20:58:48.102208 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:48.102178 2576 generic.go:358] "Generic (PLEG): container finished" podID="25272d74-18ad-4033-ac8c-cdf379ae078d" 
containerID="a8b4c0a3aadb3ff795bff6fbdb904f20e629361de8126912fd8789c33e880bbd" exitCode=0 Apr 17 20:58:48.102319 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:48.102242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" event={"ID":"25272d74-18ad-4033-ac8c-cdf379ae078d","Type":"ContainerDied","Data":"a8b4c0a3aadb3ff795bff6fbdb904f20e629361de8126912fd8789c33e880bbd"} Apr 17 20:58:48.125513 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:48.125466 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" podStartSLOduration=1.881397499 podStartE2EDuration="6.125449725s" podCreationTimestamp="2026-04-17 20:58:42 +0000 UTC" firstStartedPulling="2026-04-17 20:58:43.484711206 +0000 UTC m=+485.491125217" lastFinishedPulling="2026-04-17 20:58:47.728763427 +0000 UTC m=+489.735177443" observedRunningTime="2026-04-17 20:58:48.122180209 +0000 UTC m=+490.128594241" watchObservedRunningTime="2026-04-17 20:58:48.125449725 +0000 UTC m=+490.131863762" Apr 17 20:58:49.108044 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:49.107989 2576 generic.go:358] "Generic (PLEG): container finished" podID="25272d74-18ad-4033-ac8c-cdf379ae078d" containerID="47073fbd28b1cbf05bafa24c9fd1e0df7d604403e3c2275d5fa32a0e605896c5" exitCode=0 Apr 17 20:58:49.108476 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:49.108065 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" event={"ID":"25272d74-18ad-4033-ac8c-cdf379ae078d","Type":"ContainerDied","Data":"47073fbd28b1cbf05bafa24c9fd1e0df7d604403e3c2275d5fa32a0e605896c5"} Apr 17 20:58:50.236314 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:50.236290 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:50.269729 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:50.269702 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-bundle\") pod \"25272d74-18ad-4033-ac8c-cdf379ae078d\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " Apr 17 20:58:50.269863 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:50.269746 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-util\") pod \"25272d74-18ad-4033-ac8c-cdf379ae078d\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " Apr 17 20:58:50.269905 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:50.269865 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz75m\" (UniqueName: \"kubernetes.io/projected/25272d74-18ad-4033-ac8c-cdf379ae078d-kube-api-access-fz75m\") pod \"25272d74-18ad-4033-ac8c-cdf379ae078d\" (UID: \"25272d74-18ad-4033-ac8c-cdf379ae078d\") " Apr 17 20:58:50.270618 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:50.270589 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-bundle" (OuterVolumeSpecName: "bundle") pod "25272d74-18ad-4033-ac8c-cdf379ae078d" (UID: "25272d74-18ad-4033-ac8c-cdf379ae078d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:58:50.271946 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:50.271914 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25272d74-18ad-4033-ac8c-cdf379ae078d-kube-api-access-fz75m" (OuterVolumeSpecName: "kube-api-access-fz75m") pod "25272d74-18ad-4033-ac8c-cdf379ae078d" (UID: "25272d74-18ad-4033-ac8c-cdf379ae078d"). InnerVolumeSpecName "kube-api-access-fz75m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:58:50.273643 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:50.273621 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-util" (OuterVolumeSpecName: "util") pod "25272d74-18ad-4033-ac8c-cdf379ae078d" (UID: "25272d74-18ad-4033-ac8c-cdf379ae078d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:58:50.371145 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:50.371064 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fz75m\" (UniqueName: \"kubernetes.io/projected/25272d74-18ad-4033-ac8c-cdf379ae078d-kube-api-access-fz75m\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:58:50.371145 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:50.371091 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:58:50.371145 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:50.371103 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25272d74-18ad-4033-ac8c-cdf379ae078d-util\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 20:58:51.117454 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:51.117420 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" event={"ID":"25272d74-18ad-4033-ac8c-cdf379ae078d","Type":"ContainerDied","Data":"ea73359a4e32fe2e801185be8e4bdf01ff1aaa68218cf95d392685175109998e"} Apr 17 20:58:51.117454 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:51.117458 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea73359a4e32fe2e801185be8e4bdf01ff1aaa68218cf95d392685175109998e" Apr 17 20:58:51.117655 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:51.117431 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25qx6t" Apr 17 20:58:51.878108 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:51.878075 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:51.878601 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:51.878577 2576 scope.go:117] "RemoveContainer" containerID="eaa02d6a15691683fbde85380abe95765e61cf581c79a7e045a28fab1329d06b" Apr 17 20:58:53.125780 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:53.125744 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" event={"ID":"9065c47d-482b-4274-9e3c-f9dfa8b506fb","Type":"ContainerStarted","Data":"b09e907e0df8c30ff2d4068141c2499a097a2cf4fe572762a1c0b95eb403727b"} Apr 17 20:58:53.126169 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:53.125957 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:58:53.149980 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:53.149935 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" podStartSLOduration=3.185104899 
podStartE2EDuration="23.149915276s" podCreationTimestamp="2026-04-17 20:58:30 +0000 UTC" firstStartedPulling="2026-04-17 20:58:32.205049652 +0000 UTC m=+474.211463663" lastFinishedPulling="2026-04-17 20:58:52.16986003 +0000 UTC m=+494.176274040" observedRunningTime="2026-04-17 20:58:53.148102992 +0000 UTC m=+495.154517028" watchObservedRunningTime="2026-04-17 20:58:53.149915276 +0000 UTC m=+495.156329308" Apr 17 20:58:59.110500 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:58:59.110470 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bx2cg" Apr 17 20:59:04.131420 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:04.131381 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-cx8hg" Apr 17 20:59:10.071373 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:10.071344 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-kzt7p" Apr 17 20:59:14.857920 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.857882 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh"] Apr 17 20:59:14.858490 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.858469 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25272d74-18ad-4033-ac8c-cdf379ae078d" containerName="pull" Apr 17 20:59:14.858566 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.858493 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="25272d74-18ad-4033-ac8c-cdf379ae078d" containerName="pull" Apr 17 20:59:14.858566 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.858514 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25272d74-18ad-4033-ac8c-cdf379ae078d" containerName="extract" Apr 17 20:59:14.858566 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.858522 
2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="25272d74-18ad-4033-ac8c-cdf379ae078d" containerName="extract" Apr 17 20:59:14.858566 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.858545 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25272d74-18ad-4033-ac8c-cdf379ae078d" containerName="util" Apr 17 20:59:14.858566 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.858555 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="25272d74-18ad-4033-ac8c-cdf379ae078d" containerName="util" Apr 17 20:59:14.858808 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.858646 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="25272d74-18ad-4033-ac8c-cdf379ae078d" containerName="extract" Apr 17 20:59:14.868855 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.868834 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:14.872577 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.872084 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-nzdr8\"" Apr 17 20:59:14.872577 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.872325 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 20:59:14.873061 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.873010 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 20:59:14.873267 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.873249 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 20:59:14.874006 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.873982 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 20:59:14.877875 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.877844 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh"] Apr 17 20:59:14.974749 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.974666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp8hq\" (UniqueName: \"kubernetes.io/projected/87f053f8-a963-4596-83f5-904087b875bb-kube-api-access-jp8hq\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:14.974749 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.974717 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/87f053f8-a963-4596-83f5-904087b875bb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:14.974749 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.974746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/87f053f8-a963-4596-83f5-904087b875bb-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:14.974980 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.974794 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/87f053f8-a963-4596-83f5-904087b875bb-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: 
\"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:14.974980 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.974825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/87f053f8-a963-4596-83f5-904087b875bb-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:14.974980 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.974862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/87f053f8-a963-4596-83f5-904087b875bb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:14.974980 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:14.974887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87f053f8-a963-4596-83f5-904087b875bb-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.076283 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.076247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/87f053f8-a963-4596-83f5-904087b875bb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.076283 ip-10-0-128-248 
kubenswrapper[2576]: I0417 20:59:15.076289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/87f053f8-a963-4596-83f5-904087b875bb-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.076546 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.076460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/87f053f8-a963-4596-83f5-904087b875bb-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.076546 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.076516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/87f053f8-a963-4596-83f5-904087b875bb-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.076655 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.076558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/87f053f8-a963-4596-83f5-904087b875bb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.076655 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.076601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87f053f8-a963-4596-83f5-904087b875bb-istio-kubeconfig\") pod 
\"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.076757 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.076667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp8hq\" (UniqueName: \"kubernetes.io/projected/87f053f8-a963-4596-83f5-904087b875bb-kube-api-access-jp8hq\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.077207 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.077173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/87f053f8-a963-4596-83f5-904087b875bb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.078764 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.078741 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/87f053f8-a963-4596-83f5-904087b875bb-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.079196 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.079172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/87f053f8-a963-4596-83f5-904087b875bb-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.079278 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.079178 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/87f053f8-a963-4596-83f5-904087b875bb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.079278 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.079214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87f053f8-a963-4596-83f5-904087b875bb-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.085309 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.085291 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/87f053f8-a963-4596-83f5-904087b875bb-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.085620 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.085599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp8hq\" (UniqueName: \"kubernetes.io/projected/87f053f8-a963-4596-83f5-904087b875bb-kube-api-access-jp8hq\") pod \"istiod-openshift-gateway-55ff986f96-hc7bh\" (UID: \"87f053f8-a963-4596-83f5-904087b875bb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.183556 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.183521 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:15.319771 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:15.319742 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh"] Apr 17 20:59:15.321709 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:59:15.321678 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87f053f8_a963_4596_83f5_904087b875bb.slice/crio-948b52ba9e28f283affd529ce489c21e588ef3820ebf0461afd9a1505fc42199 WatchSource:0}: Error finding container 948b52ba9e28f283affd529ce489c21e588ef3820ebf0461afd9a1505fc42199: Status 404 returned error can't find the container with id 948b52ba9e28f283affd529ce489c21e588ef3820ebf0461afd9a1505fc42199 Apr 17 20:59:16.217452 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:16.217410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" event={"ID":"87f053f8-a963-4596-83f5-904087b875bb","Type":"ContainerStarted","Data":"948b52ba9e28f283affd529ce489c21e588ef3820ebf0461afd9a1505fc42199"} Apr 17 20:59:18.066974 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:18.066940 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 20:59:18.067243 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:18.067007 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 20:59:18.227489 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:18.227452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" 
event={"ID":"87f053f8-a963-4596-83f5-904087b875bb","Type":"ContainerStarted","Data":"10205869b2bd351692879216060253adf41cb03bc71eff73b14253a65fff07df"} Apr 17 20:59:18.227660 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:18.227557 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:18.248327 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:18.248275 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" podStartSLOduration=1.50548575 podStartE2EDuration="4.248259821s" podCreationTimestamp="2026-04-17 20:59:14 +0000 UTC" firstStartedPulling="2026-04-17 20:59:15.323938194 +0000 UTC m=+517.330352208" lastFinishedPulling="2026-04-17 20:59:18.066712269 +0000 UTC m=+520.073126279" observedRunningTime="2026-04-17 20:59:18.245885798 +0000 UTC m=+520.252299836" watchObservedRunningTime="2026-04-17 20:59:18.248259821 +0000 UTC m=+520.254673855" Apr 17 20:59:19.233500 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:19.233474 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hc7bh" Apr 17 20:59:42.599843 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.599809 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2"] Apr 17 20:59:42.603643 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.603622 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" Apr 17 20:59:42.607494 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.607471 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 20:59:42.607761 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.607491 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jn4tk\"" Apr 17 20:59:42.607761 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.607656 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 20:59:42.609389 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.609367 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2"] Apr 17 20:59:42.715725 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.715697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" Apr 17 20:59:42.715858 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.715733 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" Apr 17 20:59:42.715858 ip-10-0-128-248 kubenswrapper[2576]: I0417 
20:59:42.715764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxlgh\" (UniqueName: \"kubernetes.io/projected/03211626-ffc0-4a5b-9cd7-d55efdc86cde-kube-api-access-rxlgh\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" Apr 17 20:59:42.817043 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.816997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" Apr 17 20:59:42.817180 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.817070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" Apr 17 20:59:42.817180 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.817122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxlgh\" (UniqueName: \"kubernetes.io/projected/03211626-ffc0-4a5b-9cd7-d55efdc86cde-kube-api-access-rxlgh\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" Apr 17 20:59:42.817439 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.817416 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" Apr 17 20:59:42.817483 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.817429 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" Apr 17 20:59:42.825105 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.825085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxlgh\" (UniqueName: \"kubernetes.io/projected/03211626-ffc0-4a5b-9cd7-d55efdc86cde-kube-api-access-rxlgh\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" Apr 17 20:59:42.913583 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:42.913529 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" Apr 17 20:59:43.195191 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.195114 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b"] Apr 17 20:59:43.199828 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.199812 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" Apr 17 20:59:43.205716 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.205688 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b"] Apr 17 20:59:43.221848 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.221823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" Apr 17 20:59:43.221961 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.221860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czc9r\" (UniqueName: \"kubernetes.io/projected/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-kube-api-access-czc9r\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" Apr 17 20:59:43.222059 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.222004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" Apr 17 20:59:43.238649 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.238627 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2"] Apr 17 20:59:43.240479 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:59:43.240457 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03211626_ffc0_4a5b_9cd7_d55efdc86cde.slice/crio-1f381aaaa743831635d619a85567b44f598b5e7ede775f52300cf7b4e0d02ea6 WatchSource:0}: Error finding container 1f381aaaa743831635d619a85567b44f598b5e7ede775f52300cf7b4e0d02ea6: Status 404 returned error can't find the container with id 1f381aaaa743831635d619a85567b44f598b5e7ede775f52300cf7b4e0d02ea6 Apr 17 20:59:43.322525 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.322496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" Apr 17 20:59:43.322655 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.322546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czc9r\" (UniqueName: \"kubernetes.io/projected/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-kube-api-access-czc9r\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" Apr 17 20:59:43.322725 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.322654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" Apr 17 20:59:43.322908 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.322879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" Apr 17 20:59:43.323045 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.322975 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" Apr 17 20:59:43.323539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.323515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" event={"ID":"03211626-ffc0-4a5b-9cd7-d55efdc86cde","Type":"ContainerStarted","Data":"3c555e51e2c27d5b2cce7c02acd5a82d8f9ef291198a080f7857cf3cbcabc209"} Apr 17 20:59:43.323621 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.323550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" event={"ID":"03211626-ffc0-4a5b-9cd7-d55efdc86cde","Type":"ContainerStarted","Data":"1f381aaaa743831635d619a85567b44f598b5e7ede775f52300cf7b4e0d02ea6"} Apr 17 20:59:43.331811 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.331791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czc9r\" (UniqueName: 
\"kubernetes.io/projected/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-kube-api-access-czc9r\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" Apr 17 20:59:43.511192 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.511116 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" Apr 17 20:59:43.602595 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.602567 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf"] Apr 17 20:59:43.606974 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.606949 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" Apr 17 20:59:43.618038 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.617988 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf"] Apr 17 20:59:43.626037 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.625991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4wnh\" (UniqueName: \"kubernetes.io/projected/2c95d737-42c4-415a-a9a8-a372c39f21ad-kube-api-access-k4wnh\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" Apr 17 20:59:43.626358 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.626336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" Apr 17 20:59:43.626536 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.626519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" Apr 17 20:59:43.638682 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.638661 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b"] Apr 17 20:59:43.640489 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:59:43.640468 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b53a019_ec4b_4dcd_ada4_3f42b65329c9.slice/crio-ce4ce26b17c548196db41df095efa667a03270d0b467f20190855b8307c76142 WatchSource:0}: Error finding container ce4ce26b17c548196db41df095efa667a03270d0b467f20190855b8307c76142: Status 404 returned error can't find the container with id ce4ce26b17c548196db41df095efa667a03270d0b467f20190855b8307c76142 Apr 17 20:59:43.728543 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.728507 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" Apr 17 20:59:43.728675 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.728569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" Apr 17 20:59:43.728675 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.728662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4wnh\" (UniqueName: \"kubernetes.io/projected/2c95d737-42c4-415a-a9a8-a372c39f21ad-kube-api-access-k4wnh\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" Apr 17 20:59:43.728945 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.728924 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" Apr 17 20:59:43.728992 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.728921 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" Apr 17 20:59:43.736324 
ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.736302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4wnh\" (UniqueName: \"kubernetes.io/projected/2c95d737-42c4-415a-a9a8-a372c39f21ad-kube-api-access-k4wnh\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" Apr 17 20:59:43.930205 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:43.930180 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" Apr 17 20:59:44.002513 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.002485 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm"] Apr 17 20:59:44.007636 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.007611 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" Apr 17 20:59:44.014154 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.014061 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm"] Apr 17 20:59:44.031070 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.031042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blfn5\" (UniqueName: \"kubernetes.io/projected/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-kube-api-access-blfn5\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" Apr 17 20:59:44.031191 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.031080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" Apr 17 20:59:44.031191 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.031104 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" Apr 17 20:59:44.054112 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.054084 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf"] Apr 17 20:59:44.055581 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:59:44.055558 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c95d737_42c4_415a_a9a8_a372c39f21ad.slice/crio-693427b29e9183e9577a5b79a1aa0ad174ee08ff42362bf43baeb4bb53586cf2 WatchSource:0}: Error finding container 693427b29e9183e9577a5b79a1aa0ad174ee08ff42362bf43baeb4bb53586cf2: Status 404 returned error can't find the container with id 693427b29e9183e9577a5b79a1aa0ad174ee08ff42362bf43baeb4bb53586cf2 Apr 17 20:59:44.132217 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.132193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blfn5\" (UniqueName: \"kubernetes.io/projected/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-kube-api-access-blfn5\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" Apr 17 20:59:44.132316 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.132230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" Apr 17 20:59:44.132316 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.132255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" Apr 17 20:59:44.132592 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.132573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" Apr 17 20:59:44.132669 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.132609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" Apr 17 20:59:44.140536 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.140517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blfn5\" (UniqueName: \"kubernetes.io/projected/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-kube-api-access-blfn5\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" Apr 17 20:59:44.320971 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.320945 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" Apr 17 20:59:44.328435 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.328408 2576 generic.go:358] "Generic (PLEG): container finished" podID="2c95d737-42c4-415a-a9a8-a372c39f21ad" containerID="e1af224c6b63839e33503cee07c69d256e6c42d81c7ce41e02b6a80893790931" exitCode=0 Apr 17 20:59:44.328539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.328491 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" event={"ID":"2c95d737-42c4-415a-a9a8-a372c39f21ad","Type":"ContainerDied","Data":"e1af224c6b63839e33503cee07c69d256e6c42d81c7ce41e02b6a80893790931"} Apr 17 20:59:44.328539 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.328534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" event={"ID":"2c95d737-42c4-415a-a9a8-a372c39f21ad","Type":"ContainerStarted","Data":"693427b29e9183e9577a5b79a1aa0ad174ee08ff42362bf43baeb4bb53586cf2"} Apr 17 20:59:44.330062 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.330037 2576 generic.go:358] "Generic (PLEG): container finished" podID="03211626-ffc0-4a5b-9cd7-d55efdc86cde" containerID="3c555e51e2c27d5b2cce7c02acd5a82d8f9ef291198a080f7857cf3cbcabc209" exitCode=0 Apr 17 20:59:44.330135 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.330066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" event={"ID":"03211626-ffc0-4a5b-9cd7-d55efdc86cde","Type":"ContainerDied","Data":"3c555e51e2c27d5b2cce7c02acd5a82d8f9ef291198a080f7857cf3cbcabc209"} Apr 17 20:59:44.331551 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.331530 2576 generic.go:358] "Generic (PLEG): container finished" podID="4b53a019-ec4b-4dcd-ada4-3f42b65329c9" 
containerID="612ee602d7bc30bbe71bbfffa26dd386d3160cc9ff96b3205e3f88516d63c905" exitCode=0 Apr 17 20:59:44.331655 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.331602 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" event={"ID":"4b53a019-ec4b-4dcd-ada4-3f42b65329c9","Type":"ContainerDied","Data":"612ee602d7bc30bbe71bbfffa26dd386d3160cc9ff96b3205e3f88516d63c905"} Apr 17 20:59:44.331655 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.331624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" event={"ID":"4b53a019-ec4b-4dcd-ada4-3f42b65329c9","Type":"ContainerStarted","Data":"ce4ce26b17c548196db41df095efa667a03270d0b467f20190855b8307c76142"} Apr 17 20:59:44.446428 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:44.446399 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm"] Apr 17 20:59:44.447604 ip-10-0-128-248 kubenswrapper[2576]: W0417 20:59:44.447582 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a0db55c_8f6a_4ac6_903f_dd3250e6f435.slice/crio-fabede2599411bf487773a2b52204f5883ceb6ca891601f2ca6ee1974fed88c1 WatchSource:0}: Error finding container fabede2599411bf487773a2b52204f5883ceb6ca891601f2ca6ee1974fed88c1: Status 404 returned error can't find the container with id fabede2599411bf487773a2b52204f5883ceb6ca891601f2ca6ee1974fed88c1 Apr 17 20:59:45.336894 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:45.336863 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" event={"ID":"2c95d737-42c4-415a-a9a8-a372c39f21ad","Type":"ContainerStarted","Data":"33ce8dca57729be802dcc3137559c06d5cb9bb8a20dd7c014fb47ed15027db9b"} Apr 
17 20:59:45.338343 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:45.338320 2576 generic.go:358] "Generic (PLEG): container finished" podID="2a0db55c-8f6a-4ac6-903f-dd3250e6f435" containerID="6edbcf8ad18d090cdf6cba52a499512da5ec2fe4e2847000819e0d6684ea7c22" exitCode=0 Apr 17 20:59:45.338443 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:45.338411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" event={"ID":"2a0db55c-8f6a-4ac6-903f-dd3250e6f435","Type":"ContainerDied","Data":"6edbcf8ad18d090cdf6cba52a499512da5ec2fe4e2847000819e0d6684ea7c22"} Apr 17 20:59:45.338496 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:45.338444 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" event={"ID":"2a0db55c-8f6a-4ac6-903f-dd3250e6f435","Type":"ContainerStarted","Data":"fabede2599411bf487773a2b52204f5883ceb6ca891601f2ca6ee1974fed88c1"} Apr 17 20:59:45.340257 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:45.340232 2576 generic.go:358] "Generic (PLEG): container finished" podID="4b53a019-ec4b-4dcd-ada4-3f42b65329c9" containerID="b08eece469a5b5a208777344ebe024026c5ab0addac81eb7319dbee663f3f6e3" exitCode=0 Apr 17 20:59:45.340356 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:45.340292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" event={"ID":"4b53a019-ec4b-4dcd-ada4-3f42b65329c9","Type":"ContainerDied","Data":"b08eece469a5b5a208777344ebe024026c5ab0addac81eb7319dbee663f3f6e3"} Apr 17 20:59:46.345800 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:46.345768 2576 generic.go:358] "Generic (PLEG): container finished" podID="2c95d737-42c4-415a-a9a8-a372c39f21ad" containerID="33ce8dca57729be802dcc3137559c06d5cb9bb8a20dd7c014fb47ed15027db9b" exitCode=0 Apr 17 20:59:46.346237 ip-10-0-128-248 kubenswrapper[2576]: 
I0417 20:59:46.345842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" event={"ID":"2c95d737-42c4-415a-a9a8-a372c39f21ad","Type":"ContainerDied","Data":"33ce8dca57729be802dcc3137559c06d5cb9bb8a20dd7c014fb47ed15027db9b"}
Apr 17 20:59:46.354172 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:46.354129 2576 generic.go:358] "Generic (PLEG): container finished" podID="2a0db55c-8f6a-4ac6-903f-dd3250e6f435" containerID="8e31d4d8fa72ab37811a941cacf078a04803ef87b62712db830857c2ee31469d" exitCode=0
Apr 17 20:59:46.354269 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:46.354233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" event={"ID":"2a0db55c-8f6a-4ac6-903f-dd3250e6f435","Type":"ContainerDied","Data":"8e31d4d8fa72ab37811a941cacf078a04803ef87b62712db830857c2ee31469d"}
Apr 17 20:59:46.356052 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:46.356012 2576 generic.go:358] "Generic (PLEG): container finished" podID="03211626-ffc0-4a5b-9cd7-d55efdc86cde" containerID="7626496d1ff75bbb9e2243704b56af92d9875837e0cb1884024869f8f3ab4c43" exitCode=0
Apr 17 20:59:46.356122 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:46.356054 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" event={"ID":"03211626-ffc0-4a5b-9cd7-d55efdc86cde","Type":"ContainerDied","Data":"7626496d1ff75bbb9e2243704b56af92d9875837e0cb1884024869f8f3ab4c43"}
Apr 17 20:59:46.358016 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:46.357998 2576 generic.go:358] "Generic (PLEG): container finished" podID="4b53a019-ec4b-4dcd-ada4-3f42b65329c9" containerID="02de0320c871b397d7029bf5afb8c3ab005f93418c1a0bc18937aacbd263e632" exitCode=0
Apr 17 20:59:46.358113 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:46.358092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" event={"ID":"4b53a019-ec4b-4dcd-ada4-3f42b65329c9","Type":"ContainerDied","Data":"02de0320c871b397d7029bf5afb8c3ab005f93418c1a0bc18937aacbd263e632"}
Apr 17 20:59:47.363758 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.363720 2576 generic.go:358] "Generic (PLEG): container finished" podID="2a0db55c-8f6a-4ac6-903f-dd3250e6f435" containerID="f85cb82b22e451e3f43807df123d763e2bbdd4dc0e9b88b2e655bf86cf6d1c06" exitCode=0
Apr 17 20:59:47.364218 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.363804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" event={"ID":"2a0db55c-8f6a-4ac6-903f-dd3250e6f435","Type":"ContainerDied","Data":"f85cb82b22e451e3f43807df123d763e2bbdd4dc0e9b88b2e655bf86cf6d1c06"}
Apr 17 20:59:47.365586 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.365564 2576 generic.go:358] "Generic (PLEG): container finished" podID="03211626-ffc0-4a5b-9cd7-d55efdc86cde" containerID="e28dfbae936e6f3c877806cdd68180feb38c046cd5c7e58194e01ab93593dc4d" exitCode=0
Apr 17 20:59:47.365696 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.365625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" event={"ID":"03211626-ffc0-4a5b-9cd7-d55efdc86cde","Type":"ContainerDied","Data":"e28dfbae936e6f3c877806cdd68180feb38c046cd5c7e58194e01ab93593dc4d"}
Apr 17 20:59:47.367348 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.367329 2576 generic.go:358] "Generic (PLEG): container finished" podID="2c95d737-42c4-415a-a9a8-a372c39f21ad" containerID="c29ad5d4245a1b91014828c074593ddee76ac06c992c817ccc386a86ec7b2243" exitCode=0
Apr 17 20:59:47.367430 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.367349 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" event={"ID":"2c95d737-42c4-415a-a9a8-a372c39f21ad","Type":"ContainerDied","Data":"c29ad5d4245a1b91014828c074593ddee76ac06c992c817ccc386a86ec7b2243"}
Apr 17 20:59:47.494848 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.494825 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b"
Apr 17 20:59:47.563711 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.563680 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-bundle\") pod \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") "
Apr 17 20:59:47.563872 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.563796 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-util\") pod \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") "
Apr 17 20:59:47.563872 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.563867 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czc9r\" (UniqueName: \"kubernetes.io/projected/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-kube-api-access-czc9r\") pod \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\" (UID: \"4b53a019-ec4b-4dcd-ada4-3f42b65329c9\") "
Apr 17 20:59:47.564270 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.564242 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-bundle" (OuterVolumeSpecName: "bundle") pod "4b53a019-ec4b-4dcd-ada4-3f42b65329c9" (UID: "4b53a019-ec4b-4dcd-ada4-3f42b65329c9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:59:47.566090 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.566063 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-kube-api-access-czc9r" (OuterVolumeSpecName: "kube-api-access-czc9r") pod "4b53a019-ec4b-4dcd-ada4-3f42b65329c9" (UID: "4b53a019-ec4b-4dcd-ada4-3f42b65329c9"). InnerVolumeSpecName "kube-api-access-czc9r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:59:47.568941 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.568911 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-util" (OuterVolumeSpecName: "util") pod "4b53a019-ec4b-4dcd-ada4-3f42b65329c9" (UID: "4b53a019-ec4b-4dcd-ada4-3f42b65329c9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:59:47.664679 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.664599 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-util\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:47.664679 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.664631 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-czc9r\" (UniqueName: \"kubernetes.io/projected/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-kube-api-access-czc9r\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:47.664679 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:47.664641 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b53a019-ec4b-4dcd-ada4-3f42b65329c9-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:48.372615 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.372583 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b"
Apr 17 20:59:48.372615 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.372602 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b" event={"ID":"4b53a019-ec4b-4dcd-ada4-3f42b65329c9","Type":"ContainerDied","Data":"ce4ce26b17c548196db41df095efa667a03270d0b467f20190855b8307c76142"}
Apr 17 20:59:48.373208 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.372633 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce4ce26b17c548196db41df095efa667a03270d0b467f20190855b8307c76142"
Apr 17 20:59:48.521304 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.521282 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm"
Apr 17 20:59:48.563124 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.563103 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf"
Apr 17 20:59:48.566412 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.566394 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2"
Apr 17 20:59:48.571672 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.571651 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blfn5\" (UniqueName: \"kubernetes.io/projected/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-kube-api-access-blfn5\") pod \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") "
Apr 17 20:59:48.571756 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.571707 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-util\") pod \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") "
Apr 17 20:59:48.571815 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.571801 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-bundle\") pod \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\" (UID: \"2a0db55c-8f6a-4ac6-903f-dd3250e6f435\") "
Apr 17 20:59:48.572459 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.572438 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-bundle" (OuterVolumeSpecName: "bundle") pod "2a0db55c-8f6a-4ac6-903f-dd3250e6f435" (UID: "2a0db55c-8f6a-4ac6-903f-dd3250e6f435"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:59:48.573697 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.573673 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-kube-api-access-blfn5" (OuterVolumeSpecName: "kube-api-access-blfn5") pod "2a0db55c-8f6a-4ac6-903f-dd3250e6f435" (UID: "2a0db55c-8f6a-4ac6-903f-dd3250e6f435"). InnerVolumeSpecName "kube-api-access-blfn5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:59:48.576837 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.576818 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-util" (OuterVolumeSpecName: "util") pod "2a0db55c-8f6a-4ac6-903f-dd3250e6f435" (UID: "2a0db55c-8f6a-4ac6-903f-dd3250e6f435"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:59:48.672945 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.672866 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-bundle\") pod \"2c95d737-42c4-415a-a9a8-a372c39f21ad\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") "
Apr 17 20:59:48.673130 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.672981 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-bundle\") pod \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") "
Apr 17 20:59:48.673130 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.673043 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4wnh\" (UniqueName: \"kubernetes.io/projected/2c95d737-42c4-415a-a9a8-a372c39f21ad-kube-api-access-k4wnh\") pod \"2c95d737-42c4-415a-a9a8-a372c39f21ad\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") "
Apr 17 20:59:48.673130 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.673078 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-util\") pod \"2c95d737-42c4-415a-a9a8-a372c39f21ad\" (UID: \"2c95d737-42c4-415a-a9a8-a372c39f21ad\") "
Apr 17 20:59:48.673304 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.673146 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxlgh\" (UniqueName: \"kubernetes.io/projected/03211626-ffc0-4a5b-9cd7-d55efdc86cde-kube-api-access-rxlgh\") pod \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") "
Apr 17 20:59:48.673304 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.673173 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-util\") pod \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\" (UID: \"03211626-ffc0-4a5b-9cd7-d55efdc86cde\") "
Apr 17 20:59:48.673444 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.673423 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-blfn5\" (UniqueName: \"kubernetes.io/projected/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-kube-api-access-blfn5\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:48.673503 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.673450 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-util\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:48.673503 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.673464 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a0db55c-8f6a-4ac6-903f-dd3250e6f435-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:48.673597 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.673531 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-bundle" (OuterVolumeSpecName: "bundle") pod "2c95d737-42c4-415a-a9a8-a372c39f21ad" (UID: "2c95d737-42c4-415a-a9a8-a372c39f21ad"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:59:48.673648 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.673633 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-bundle" (OuterVolumeSpecName: "bundle") pod "03211626-ffc0-4a5b-9cd7-d55efdc86cde" (UID: "03211626-ffc0-4a5b-9cd7-d55efdc86cde"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:59:48.676077 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.676044 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03211626-ffc0-4a5b-9cd7-d55efdc86cde-kube-api-access-rxlgh" (OuterVolumeSpecName: "kube-api-access-rxlgh") pod "03211626-ffc0-4a5b-9cd7-d55efdc86cde" (UID: "03211626-ffc0-4a5b-9cd7-d55efdc86cde"). InnerVolumeSpecName "kube-api-access-rxlgh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:59:48.676077 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.676062 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c95d737-42c4-415a-a9a8-a372c39f21ad-kube-api-access-k4wnh" (OuterVolumeSpecName: "kube-api-access-k4wnh") pod "2c95d737-42c4-415a-a9a8-a372c39f21ad" (UID: "2c95d737-42c4-415a-a9a8-a372c39f21ad"). InnerVolumeSpecName "kube-api-access-k4wnh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:59:48.678731 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.678707 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-util" (OuterVolumeSpecName: "util") pod "03211626-ffc0-4a5b-9cd7-d55efdc86cde" (UID: "03211626-ffc0-4a5b-9cd7-d55efdc86cde"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:59:48.679662 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.679643 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-util" (OuterVolumeSpecName: "util") pod "2c95d737-42c4-415a-a9a8-a372c39f21ad" (UID: "2c95d737-42c4-415a-a9a8-a372c39f21ad"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:59:48.774429 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.774407 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rxlgh\" (UniqueName: \"kubernetes.io/projected/03211626-ffc0-4a5b-9cd7-d55efdc86cde-kube-api-access-rxlgh\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:48.774429 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.774428 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-util\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:48.774569 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.774439 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:48.774569 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.774448 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03211626-ffc0-4a5b-9cd7-d55efdc86cde-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:48.774569 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.774456 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k4wnh\" (UniqueName: \"kubernetes.io/projected/2c95d737-42c4-415a-a9a8-a372c39f21ad-kube-api-access-k4wnh\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:48.774569 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:48.774465 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c95d737-42c4-415a-a9a8-a372c39f21ad-util\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\""
Apr 17 20:59:49.377879 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:49.377836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf" event={"ID":"2c95d737-42c4-415a-a9a8-a372c39f21ad","Type":"ContainerDied","Data":"693427b29e9183e9577a5b79a1aa0ad174ee08ff42362bf43baeb4bb53586cf2"}
Apr 17 20:59:49.377879 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:49.377878 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="693427b29e9183e9577a5b79a1aa0ad174ee08ff42362bf43baeb4bb53586cf2"
Apr 17 20:59:49.377879 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:49.377883 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf"
Apr 17 20:59:49.379635 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:49.379609 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm" event={"ID":"2a0db55c-8f6a-4ac6-903f-dd3250e6f435","Type":"ContainerDied","Data":"fabede2599411bf487773a2b52204f5883ceb6ca891601f2ca6ee1974fed88c1"}
Apr 17 20:59:49.379635 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:49.379625 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm"
Apr 17 20:59:49.379635 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:49.379638 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fabede2599411bf487773a2b52204f5883ceb6ca891601f2ca6ee1974fed88c1"
Apr 17 20:59:49.381398 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:49.381379 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2"
Apr 17 20:59:49.381484 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:49.381370 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2" event={"ID":"03211626-ffc0-4a5b-9cd7-d55efdc86cde","Type":"ContainerDied","Data":"1f381aaaa743831635d619a85567b44f598b5e7ede775f52300cf7b4e0d02ea6"}
Apr 17 20:59:49.381528 ip-10-0-128-248 kubenswrapper[2576]: I0417 20:59:49.381493 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f381aaaa743831635d619a85567b44f598b5e7ede775f52300cf7b4e0d02ea6"
Apr 17 21:00:00.196569 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196531 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"]
Apr 17 21:00:00.196946 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196913 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a0db55c-8f6a-4ac6-903f-dd3250e6f435" containerName="extract"
Apr 17 21:00:00.196946 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196924 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0db55c-8f6a-4ac6-903f-dd3250e6f435" containerName="extract"
Apr 17 21:00:00.196946 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196937 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03211626-ffc0-4a5b-9cd7-d55efdc86cde" containerName="pull"
Apr 17 21:00:00.196946 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196942 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="03211626-ffc0-4a5b-9cd7-d55efdc86cde" containerName="pull"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196954 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b53a019-ec4b-4dcd-ada4-3f42b65329c9" containerName="util"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196959 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b53a019-ec4b-4dcd-ada4-3f42b65329c9" containerName="util"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196968 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c95d737-42c4-415a-a9a8-a372c39f21ad" containerName="util"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196973 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c95d737-42c4-415a-a9a8-a372c39f21ad" containerName="util"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196979 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a0db55c-8f6a-4ac6-903f-dd3250e6f435" containerName="util"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196984 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0db55c-8f6a-4ac6-903f-dd3250e6f435" containerName="util"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196990 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b53a019-ec4b-4dcd-ada4-3f42b65329c9" containerName="extract"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.196996 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b53a019-ec4b-4dcd-ada4-3f42b65329c9" containerName="extract"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197003 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03211626-ffc0-4a5b-9cd7-d55efdc86cde" containerName="util"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197008 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="03211626-ffc0-4a5b-9cd7-d55efdc86cde" containerName="util"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197014 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c95d737-42c4-415a-a9a8-a372c39f21ad" containerName="pull"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197019 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c95d737-42c4-415a-a9a8-a372c39f21ad" containerName="pull"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197042 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03211626-ffc0-4a5b-9cd7-d55efdc86cde" containerName="extract"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197047 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="03211626-ffc0-4a5b-9cd7-d55efdc86cde" containerName="extract"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197055 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b53a019-ec4b-4dcd-ada4-3f42b65329c9" containerName="pull"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197060 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b53a019-ec4b-4dcd-ada4-3f42b65329c9" containerName="pull"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197065 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a0db55c-8f6a-4ac6-903f-dd3250e6f435" containerName="pull"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197070 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0db55c-8f6a-4ac6-903f-dd3250e6f435" containerName="pull"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197077 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c95d737-42c4-415a-a9a8-a372c39f21ad" containerName="extract"
Apr 17 21:00:00.197092 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197082 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c95d737-42c4-415a-a9a8-a372c39f21ad" containerName="extract"
Apr 17 21:00:00.197683 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197140 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b53a019-ec4b-4dcd-ada4-3f42b65329c9" containerName="extract"
Apr 17 21:00:00.197683 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197147 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a0db55c-8f6a-4ac6-903f-dd3250e6f435" containerName="extract"
Apr 17 21:00:00.197683 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197155 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c95d737-42c4-415a-a9a8-a372c39f21ad" containerName="extract"
Apr 17 21:00:00.197683 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.197161 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="03211626-ffc0-4a5b-9cd7-d55efdc86cde" containerName="extract"
Apr 17 21:00:00.200188 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.200172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"
Apr 17 21:00:00.203640 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.203614 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-dj7dw\""
Apr 17 21:00:00.203774 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.203615 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 21:00:00.203774 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.203667 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 21:00:00.218490 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.218469 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"]
Apr 17 21:00:00.270640 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.270613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bt7x\" (UniqueName: \"kubernetes.io/projected/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-kube-api-access-7bt7x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-lb25t\" (UID: \"54172ef6-ac96-4324-b0eb-8d9adcb84cfd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"
Apr 17 21:00:00.270774 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.270694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-lb25t\" (UID: \"54172ef6-ac96-4324-b0eb-8d9adcb84cfd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"
Apr 17 21:00:00.371739 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.371710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-lb25t\" (UID: \"54172ef6-ac96-4324-b0eb-8d9adcb84cfd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"
Apr 17 21:00:00.371884 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.371864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bt7x\" (UniqueName: \"kubernetes.io/projected/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-kube-api-access-7bt7x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-lb25t\" (UID: \"54172ef6-ac96-4324-b0eb-8d9adcb84cfd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"
Apr 17 21:00:00.372096 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.372078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-lb25t\" (UID: \"54172ef6-ac96-4324-b0eb-8d9adcb84cfd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"
Apr 17 21:00:00.380339 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.380318 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bt7x\" (UniqueName: \"kubernetes.io/projected/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-kube-api-access-7bt7x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-lb25t\" (UID: \"54172ef6-ac96-4324-b0eb-8d9adcb84cfd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"
Apr 17 21:00:00.509762 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.509692 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"
Apr 17 21:00:00.640481 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:00.640451 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"]
Apr 17 21:00:00.641750 ip-10-0-128-248 kubenswrapper[2576]: W0417 21:00:00.641722 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54172ef6_ac96_4324_b0eb_8d9adcb84cfd.slice/crio-32c4999185bfee156739810d4018302a2ac1f3ed705005663504cadf3c8c1f2d WatchSource:0}: Error finding container 32c4999185bfee156739810d4018302a2ac1f3ed705005663504cadf3c8c1f2d: Status 404 returned error can't find the container with id 32c4999185bfee156739810d4018302a2ac1f3ed705005663504cadf3c8c1f2d
Apr 17 21:00:01.434195 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:01.434160 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t" event={"ID":"54172ef6-ac96-4324-b0eb-8d9adcb84cfd","Type":"ContainerStarted","Data":"32c4999185bfee156739810d4018302a2ac1f3ed705005663504cadf3c8c1f2d"}
Apr 17 21:00:05.453431 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:05.453319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t" event={"ID":"54172ef6-ac96-4324-b0eb-8d9adcb84cfd","Type":"ContainerStarted","Data":"b0fa58aedafc1e0f244f850d39703ed5f9b72bb63dd8066ad266e757bd6824b4"}
Apr 17 21:00:05.453803 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:05.453490 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"
Apr 17 21:00:05.478742 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:05.478687 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t" podStartSLOduration=0.930275884 podStartE2EDuration="5.478672961s" podCreationTimestamp="2026-04-17 21:00:00 +0000 UTC" firstStartedPulling="2026-04-17 21:00:00.643836596 +0000 UTC m=+562.650250606" lastFinishedPulling="2026-04-17 21:00:05.192233668 +0000 UTC m=+567.198647683" observedRunningTime="2026-04-17 21:00:05.475317426 +0000 UTC m=+567.481731459" watchObservedRunningTime="2026-04-17 21:00:05.478672961 +0000 UTC m=+567.485087050"
Apr 17 21:00:09.684258 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.684224 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7ff664dfb6-lcc5q"]
Apr 17 21:00:09.688253 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.688232 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.697511 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.697487 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7ff664dfb6-lcc5q"]
Apr 17 21:00:09.857042 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.856988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-oauth-serving-cert\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.857258 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.857056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/422bf273-28f9-489e-8bfb-0834a4d90d5b-console-oauth-config\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.857258 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.857175 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87jrc\" (UniqueName: \"kubernetes.io/projected/422bf273-28f9-489e-8bfb-0834a4d90d5b-kube-api-access-87jrc\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.857258 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.857237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-service-ca\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.857401 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.857272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-console-config\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.857401 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.857329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/422bf273-28f9-489e-8bfb-0834a4d90d5b-console-serving-cert\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.857401 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.857385 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-trusted-ca-bundle\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.958003 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.957912 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-service-ca\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.958003 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.957957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-console-config\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.958003 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.957992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/422bf273-28f9-489e-8bfb-0834a4d90d5b-console-serving-cert\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.958308 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.958073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-trusted-ca-bundle\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q"
Apr 17 21:00:09.958308 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.958096 2576 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-oauth-serving-cert\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:09.958308 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.958131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/422bf273-28f9-489e-8bfb-0834a4d90d5b-console-oauth-config\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:09.958308 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.958184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87jrc\" (UniqueName: \"kubernetes.io/projected/422bf273-28f9-489e-8bfb-0834a4d90d5b-kube-api-access-87jrc\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:09.958741 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.958711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-service-ca\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:09.958863 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.958740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-console-config\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:09.959061 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.959018 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-trusted-ca-bundle\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:09.959277 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.959258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/422bf273-28f9-489e-8bfb-0834a4d90d5b-oauth-serving-cert\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:09.960752 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.960728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/422bf273-28f9-489e-8bfb-0834a4d90d5b-console-serving-cert\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:09.961183 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.961162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/422bf273-28f9-489e-8bfb-0834a4d90d5b-console-oauth-config\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:09.966849 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:09.966826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87jrc\" (UniqueName: \"kubernetes.io/projected/422bf273-28f9-489e-8bfb-0834a4d90d5b-kube-api-access-87jrc\") pod \"console-7ff664dfb6-lcc5q\" (UID: \"422bf273-28f9-489e-8bfb-0834a4d90d5b\") " pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:09.998440 ip-10-0-128-248 
kubenswrapper[2576]: I0417 21:00:09.998417 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:10.127227 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:10.127200 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7ff664dfb6-lcc5q"] Apr 17 21:00:10.128275 ip-10-0-128-248 kubenswrapper[2576]: W0417 21:00:10.128243 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod422bf273_28f9_489e_8bfb_0834a4d90d5b.slice/crio-3079bd76b4f25ae58c1c3d560ef2eedd42625fc92f410e1a98dbabe0c184e172 WatchSource:0}: Error finding container 3079bd76b4f25ae58c1c3d560ef2eedd42625fc92f410e1a98dbabe0c184e172: Status 404 returned error can't find the container with id 3079bd76b4f25ae58c1c3d560ef2eedd42625fc92f410e1a98dbabe0c184e172 Apr 17 21:00:10.477278 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:10.477242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ff664dfb6-lcc5q" event={"ID":"422bf273-28f9-489e-8bfb-0834a4d90d5b","Type":"ContainerStarted","Data":"c28089263cef9e1cb4a727d03048748603ec9d7d17166dc77395c4a9ff1cd20a"} Apr 17 21:00:10.477278 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:10.477282 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ff664dfb6-lcc5q" event={"ID":"422bf273-28f9-489e-8bfb-0834a4d90d5b","Type":"ContainerStarted","Data":"3079bd76b4f25ae58c1c3d560ef2eedd42625fc92f410e1a98dbabe0c184e172"} Apr 17 21:00:10.497309 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:10.497264 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7ff664dfb6-lcc5q" podStartSLOduration=1.497250521 podStartE2EDuration="1.497250521s" podCreationTimestamp="2026-04-17 21:00:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:00:10.494781026 +0000 UTC m=+572.501195084" watchObservedRunningTime="2026-04-17 21:00:10.497250521 +0000 UTC m=+572.503664555" Apr 17 21:00:16.459755 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:16.459718 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t" Apr 17 21:00:18.223695 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.223662 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs"] Apr 17 21:00:18.228160 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.228141 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" Apr 17 21:00:18.241106 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.241080 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs"] Apr 17 21:00:18.278331 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.278308 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs"] Apr 17 21:00:18.278529 ip-10-0-128-248 kubenswrapper[2576]: E0417 21:00:18.278510 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-62s7k], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" podUID="e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea" Apr 17 21:00:18.288721 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.288694 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs"] Apr 17 21:00:18.299614 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.299586 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"] Apr 17 21:00:18.299864 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.299834 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t" podUID="54172ef6-ac96-4324-b0eb-8d9adcb84cfd" containerName="manager" containerID="cri-o://b0fa58aedafc1e0f244f850d39703ed5f9b72bb63dd8066ad266e757bd6824b4" gracePeriod=2 Apr 17 21:00:18.301966 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.301946 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh"] Apr 17 21:00:18.305660 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.305642 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" Apr 17 21:00:18.317552 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.317518 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t"] Apr 17 21:00:18.319993 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.319972 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh"] Apr 17 21:00:18.327896 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.327875 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c"] Apr 17 21:00:18.328278 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.328262 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54172ef6-ac96-4324-b0eb-8d9adcb84cfd" containerName="manager" Apr 17 21:00:18.328278 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.328278 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="54172ef6-ac96-4324-b0eb-8d9adcb84cfd" containerName="manager" Apr 17 21:00:18.328490 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.328339 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="54172ef6-ac96-4324-b0eb-8d9adcb84cfd" containerName="manager" Apr 17 21:00:18.331450 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.331431 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:18.338407 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.338379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-s92zs\" (UID: \"e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" Apr 17 21:00:18.338579 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.338556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62s7k\" (UniqueName: \"kubernetes.io/projected/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea-kube-api-access-62s7k\") pod \"kuadrant-operator-controller-manager-55c7f4c975-s92zs\" (UID: \"e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" Apr 17 21:00:18.352793 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.352772 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c"] Apr 17 21:00:18.439657 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.439622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/c001474b-fef5-48f1-93dc-4225796b24ca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w4g8c\" (UID: \"c001474b-fef5-48f1-93dc-4225796b24ca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:18.439772 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.439724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62s7k\" (UniqueName: \"kubernetes.io/projected/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea-kube-api-access-62s7k\") pod \"kuadrant-operator-controller-manager-55c7f4c975-s92zs\" (UID: \"e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" Apr 17 21:00:18.439818 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.439783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-httmk\" (UniqueName: \"kubernetes.io/projected/58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2-kube-api-access-httmk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-khlfh\" (UID: \"58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" Apr 17 21:00:18.439818 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.439810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wf4x\" (UniqueName: \"kubernetes.io/projected/c001474b-fef5-48f1-93dc-4225796b24ca-kube-api-access-2wf4x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w4g8c\" (UID: \"c001474b-fef5-48f1-93dc-4225796b24ca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:18.439882 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.439838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-s92zs\" (UID: \"e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" Apr 17 21:00:18.439882 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.439863 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-khlfh\" (UID: \"58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" Apr 17 21:00:18.440319 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.440293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-s92zs\" (UID: \"e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" Apr 17 21:00:18.445284 ip-10-0-128-248 kubenswrapper[2576]: E0417 21:00:18.445223 2576 projected.go:194] Error preparing data for projected volume kube-api-access-62s7k for pod kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs: failed to fetch token: pod "kuadrant-operator-controller-manager-55c7f4c975-s92zs" not found Apr 17 21:00:18.445398 ip-10-0-128-248 kubenswrapper[2576]: E0417 21:00:18.445291 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea-kube-api-access-62s7k podName:e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea nodeName:}" failed. 
No retries permitted until 2026-04-17 21:00:18.945269971 +0000 UTC m=+580.951683982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-62s7k" (UniqueName: "kubernetes.io/projected/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea-kube-api-access-62s7k") pod "kuadrant-operator-controller-manager-55c7f4c975-s92zs" (UID: "e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea") : failed to fetch token: pod "kuadrant-operator-controller-manager-55c7f4c975-s92zs" not found Apr 17 21:00:18.507653 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.507576 2576 generic.go:358] "Generic (PLEG): container finished" podID="54172ef6-ac96-4324-b0eb-8d9adcb84cfd" containerID="b0fa58aedafc1e0f244f850d39703ed5f9b72bb63dd8066ad266e757bd6824b4" exitCode=0 Apr 17 21:00:18.507775 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.507717 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" Apr 17 21:00:18.512102 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.512082 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" Apr 17 21:00:18.514649 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.514621 2576 status_manager.go:895] "Failed to get status for pod" podUID="e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-s92zs\" is forbidden: User \"system:node:ip-10-0-128-248.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-128-248.ec2.internal' and this object" Apr 17 21:00:18.535944 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.535928 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t" Apr 17 21:00:18.538363 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.538341 2576 status_manager.go:895] "Failed to get status for pod" podUID="e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-s92zs\" is forbidden: User \"system:node:ip-10-0-128-248.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-128-248.ec2.internal' and this object" Apr 17 21:00:18.540823 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.540802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-httmk\" (UniqueName: \"kubernetes.io/projected/58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2-kube-api-access-httmk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-khlfh\" (UID: \"58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" Apr 17 21:00:18.540891 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.540837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wf4x\" (UniqueName: \"kubernetes.io/projected/c001474b-fef5-48f1-93dc-4225796b24ca-kube-api-access-2wf4x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w4g8c\" (UID: \"c001474b-fef5-48f1-93dc-4225796b24ca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:18.540891 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.540860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-khlfh\" (UID: 
\"58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" Apr 17 21:00:18.540974 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.540898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c001474b-fef5-48f1-93dc-4225796b24ca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w4g8c\" (UID: \"c001474b-fef5-48f1-93dc-4225796b24ca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:18.541317 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.541296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-khlfh\" (UID: \"58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" Apr 17 21:00:18.541365 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.541345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c001474b-fef5-48f1-93dc-4225796b24ca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w4g8c\" (UID: \"c001474b-fef5-48f1-93dc-4225796b24ca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:18.548850 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.548824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wf4x\" (UniqueName: \"kubernetes.io/projected/c001474b-fef5-48f1-93dc-4225796b24ca-kube-api-access-2wf4x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w4g8c\" (UID: \"c001474b-fef5-48f1-93dc-4225796b24ca\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:18.548960 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.548905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-httmk\" (UniqueName: \"kubernetes.io/projected/58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2-kube-api-access-httmk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-khlfh\" (UID: \"58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" Apr 17 21:00:18.642160 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.642130 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bt7x\" (UniqueName: \"kubernetes.io/projected/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-kube-api-access-7bt7x\") pod \"54172ef6-ac96-4324-b0eb-8d9adcb84cfd\" (UID: \"54172ef6-ac96-4324-b0eb-8d9adcb84cfd\") " Apr 17 21:00:18.642283 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.642203 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-extensions-socket-volume\") pod \"54172ef6-ac96-4324-b0eb-8d9adcb84cfd\" (UID: \"54172ef6-ac96-4324-b0eb-8d9adcb84cfd\") " Apr 17 21:00:18.642283 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.642274 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea-extensions-socket-volume\") pod \"e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea\" (UID: \"e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea\") " Apr 17 21:00:18.642542 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.642516 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea-extensions-socket-volume" (OuterVolumeSpecName: 
"extensions-socket-volume") pod "e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea" (UID: "e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:00:18.642615 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.642575 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-62s7k\" (UniqueName: \"kubernetes.io/projected/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea-kube-api-access-62s7k\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:18.642670 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.642631 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "54172ef6-ac96-4324-b0eb-8d9adcb84cfd" (UID: "54172ef6-ac96-4324-b0eb-8d9adcb84cfd"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:00:18.644050 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.644017 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-kube-api-access-7bt7x" (OuterVolumeSpecName: "kube-api-access-7bt7x") pod "54172ef6-ac96-4324-b0eb-8d9adcb84cfd" (UID: "54172ef6-ac96-4324-b0eb-8d9adcb84cfd"). InnerVolumeSpecName "kube-api-access-7bt7x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:00:18.675256 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.675234 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" Apr 17 21:00:18.683897 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.683868 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:18.746153 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.743966 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea-extensions-socket-volume\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:18.746153 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.743999 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7bt7x\" (UniqueName: \"kubernetes.io/projected/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-kube-api-access-7bt7x\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:18.746153 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.744016 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/54172ef6-ac96-4324-b0eb-8d9adcb84cfd-extensions-socket-volume\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:18.875002 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:18.874979 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c"] Apr 17 21:00:18.876424 ip-10-0-128-248 kubenswrapper[2576]: W0417 21:00:18.876396 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc001474b_fef5_48f1_93dc_4225796b24ca.slice/crio-2bd6182d7f41ef23108481651042f88e2aebadaceba599ca6798e89c6b8873dc WatchSource:0}: Error finding container 2bd6182d7f41ef23108481651042f88e2aebadaceba599ca6798e89c6b8873dc: Status 404 returned error can't find the container with id 2bd6182d7f41ef23108481651042f88e2aebadaceba599ca6798e89c6b8873dc Apr 17 21:00:19.026940 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.026915 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh"] Apr 17 21:00:19.513511 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.513470 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" event={"ID":"58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2","Type":"ContainerStarted","Data":"767680c03dd0bf60e6f5074784e53b6d024c22646048d5adbdc8041ddfe1e342"} Apr 17 21:00:19.513511 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.513509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" event={"ID":"58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2","Type":"ContainerStarted","Data":"ffade7bfdaec405241c65604e75b2e3407e7fd67c02db38548be4be87ba84745"} Apr 17 21:00:19.514073 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.513550 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" Apr 17 21:00:19.514986 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.514956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" event={"ID":"c001474b-fef5-48f1-93dc-4225796b24ca","Type":"ContainerStarted","Data":"10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0"} Apr 17 21:00:19.515137 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.514994 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" event={"ID":"c001474b-fef5-48f1-93dc-4225796b24ca","Type":"ContainerStarted","Data":"2bd6182d7f41ef23108481651042f88e2aebadaceba599ca6798e89c6b8873dc"} Apr 17 21:00:19.515137 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.515070 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:19.515873 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.515839 2576 status_manager.go:895] "Failed to get status for pod" podUID="e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-s92zs\" is forbidden: User \"system:node:ip-10-0-128-248.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-128-248.ec2.internal' and this object" Apr 17 21:00:19.516239 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.516221 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-lb25t" Apr 17 21:00:19.516325 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.516260 2576 scope.go:117] "RemoveContainer" containerID="b0fa58aedafc1e0f244f850d39703ed5f9b72bb63dd8066ad266e757bd6824b4" Apr 17 21:00:19.516460 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.516428 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" Apr 17 21:00:19.535575 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.535538 2576 status_manager.go:895] "Failed to get status for pod" podUID="e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-s92zs\" is forbidden: User \"system:node:ip-10-0-128-248.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-128-248.ec2.internal' and this object" Apr 17 21:00:19.538610 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.538570 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" podStartSLOduration=1.538558202 podStartE2EDuration="1.538558202s" podCreationTimestamp="2026-04-17 21:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:00:19.533364252 +0000 UTC m=+581.539778285" watchObservedRunningTime="2026-04-17 21:00:19.538558202 +0000 UTC m=+581.544972234" Apr 17 21:00:19.556074 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.556010 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" podStartSLOduration=1.555998582 podStartE2EDuration="1.555998582s" podCreationTimestamp="2026-04-17 21:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:00:19.553410258 +0000 UTC m=+581.559824295" watchObservedRunningTime="2026-04-17 21:00:19.555998582 +0000 UTC m=+581.562412614" Apr 17 21:00:19.999070 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.999021 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:19.999070 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:19.999074 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:20.003664 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:20.003641 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:20.005933 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:20.005897 2576 status_manager.go:895] "Failed to get status for pod" podUID="e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-s92zs" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-s92zs\" is forbidden: User \"system:node:ip-10-0-128-248.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-128-248.ec2.internal' and this object" Apr 17 21:00:20.487898 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:20.487845 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54172ef6-ac96-4324-b0eb-8d9adcb84cfd" path="/var/lib/kubelet/pods/54172ef6-ac96-4324-b0eb-8d9adcb84cfd/volumes" Apr 17 21:00:20.488282 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:20.488260 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea" path="/var/lib/kubelet/pods/e9e5dcf6-050b-4a68-b8ed-7f7c86a28aea/volumes" Apr 17 21:00:20.524972 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:20.524945 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7ff664dfb6-lcc5q" Apr 17 21:00:20.604419 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:20.604384 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-676df7fdc7-s829p"] Apr 17 21:00:30.523570 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:30.523540 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-khlfh" Apr 17 21:00:30.523995 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:30.523589 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:30.605884 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:30.605814 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c"] Apr 17 21:00:30.606257 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:30.606198 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" podUID="c001474b-fef5-48f1-93dc-4225796b24ca" containerName="manager" containerID="cri-o://10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0" gracePeriod=10 Apr 17 21:00:30.848865 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:30.848844 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:30.951586 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:30.951548 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c001474b-fef5-48f1-93dc-4225796b24ca-extensions-socket-volume\") pod \"c001474b-fef5-48f1-93dc-4225796b24ca\" (UID: \"c001474b-fef5-48f1-93dc-4225796b24ca\") " Apr 17 21:00:30.951764 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:30.951653 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wf4x\" (UniqueName: \"kubernetes.io/projected/c001474b-fef5-48f1-93dc-4225796b24ca-kube-api-access-2wf4x\") pod \"c001474b-fef5-48f1-93dc-4225796b24ca\" (UID: \"c001474b-fef5-48f1-93dc-4225796b24ca\") " Apr 17 21:00:30.951974 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:30.951951 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c001474b-fef5-48f1-93dc-4225796b24ca-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "c001474b-fef5-48f1-93dc-4225796b24ca" (UID: "c001474b-fef5-48f1-93dc-4225796b24ca"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:00:30.953594 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:30.953573 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c001474b-fef5-48f1-93dc-4225796b24ca-kube-api-access-2wf4x" (OuterVolumeSpecName: "kube-api-access-2wf4x") pod "c001474b-fef5-48f1-93dc-4225796b24ca" (UID: "c001474b-fef5-48f1-93dc-4225796b24ca"). InnerVolumeSpecName "kube-api-access-2wf4x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:00:31.052411 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:31.052368 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c001474b-fef5-48f1-93dc-4225796b24ca-extensions-socket-volume\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:31.052411 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:31.052402 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2wf4x\" (UniqueName: \"kubernetes.io/projected/c001474b-fef5-48f1-93dc-4225796b24ca-kube-api-access-2wf4x\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:31.565611 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:31.565572 2576 generic.go:358] "Generic (PLEG): container finished" podID="c001474b-fef5-48f1-93dc-4225796b24ca" containerID="10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0" exitCode=0 Apr 17 21:00:31.566016 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:31.565642 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" Apr 17 21:00:31.566016 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:31.565647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" event={"ID":"c001474b-fef5-48f1-93dc-4225796b24ca","Type":"ContainerDied","Data":"10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0"} Apr 17 21:00:31.566016 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:31.565684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c" event={"ID":"c001474b-fef5-48f1-93dc-4225796b24ca","Type":"ContainerDied","Data":"2bd6182d7f41ef23108481651042f88e2aebadaceba599ca6798e89c6b8873dc"} Apr 17 21:00:31.566016 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:31.565699 2576 scope.go:117] "RemoveContainer" containerID="10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0" Apr 17 21:00:31.574699 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:31.574681 2576 scope.go:117] "RemoveContainer" containerID="10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0" Apr 17 21:00:31.574944 ip-10-0-128-248 kubenswrapper[2576]: E0417 21:00:31.574926 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0\": container with ID starting with 10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0 not found: ID does not exist" containerID="10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0" Apr 17 21:00:31.575006 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:31.574956 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0"} err="failed to get container status 
\"10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0\": rpc error: code = NotFound desc = could not find container \"10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0\": container with ID starting with 10cd52cbb2ff670dc2eaf8b7f72e51acb835a6d6be9d13965a28ea02c78d73e0 not found: ID does not exist" Apr 17 21:00:31.588340 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:31.588312 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c"] Apr 17 21:00:31.591207 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:31.591181 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w4g8c"] Apr 17 21:00:32.488614 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:32.488578 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c001474b-fef5-48f1-93dc-4225796b24ca" path="/var/lib/kubelet/pods/c001474b-fef5-48f1-93dc-4225796b24ca/volumes" Apr 17 21:00:38.455710 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:38.455682 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log" Apr 17 21:00:38.456545 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:38.456519 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log" Apr 17 21:00:45.623442 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.623376 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-676df7fdc7-s829p" podUID="ba3f396b-dc67-4c7a-8171-03acae476c79" containerName="console" containerID="cri-o://a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0" gracePeriod=15 Apr 17 21:00:45.867431 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.867406 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console_console-676df7fdc7-s829p_ba3f396b-dc67-4c7a-8171-03acae476c79/console/0.log" Apr 17 21:00:45.867539 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.867470 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-676df7fdc7-s829p" Apr 17 21:00:45.989168 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.989082 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-serving-cert\") pod \"ba3f396b-dc67-4c7a-8171-03acae476c79\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " Apr 17 21:00:45.989168 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.989133 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-service-ca\") pod \"ba3f396b-dc67-4c7a-8171-03acae476c79\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " Apr 17 21:00:45.989168 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.989167 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-oauth-config\") pod \"ba3f396b-dc67-4c7a-8171-03acae476c79\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " Apr 17 21:00:45.989460 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.989246 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-trusted-ca-bundle\") pod \"ba3f396b-dc67-4c7a-8171-03acae476c79\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " Apr 17 21:00:45.989460 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.989300 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-console-config\") pod \"ba3f396b-dc67-4c7a-8171-03acae476c79\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " Apr 17 21:00:45.989460 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.989339 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-oauth-serving-cert\") pod \"ba3f396b-dc67-4c7a-8171-03acae476c79\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " Apr 17 21:00:45.989460 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.989365 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9ng2\" (UniqueName: \"kubernetes.io/projected/ba3f396b-dc67-4c7a-8171-03acae476c79-kube-api-access-j9ng2\") pod \"ba3f396b-dc67-4c7a-8171-03acae476c79\" (UID: \"ba3f396b-dc67-4c7a-8171-03acae476c79\") " Apr 17 21:00:45.989685 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.989655 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-service-ca" (OuterVolumeSpecName: "service-ca") pod "ba3f396b-dc67-4c7a-8171-03acae476c79" (UID: "ba3f396b-dc67-4c7a-8171-03acae476c79"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:00:45.989874 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.989848 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ba3f396b-dc67-4c7a-8171-03acae476c79" (UID: "ba3f396b-dc67-4c7a-8171-03acae476c79"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:00:45.989950 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.989864 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-console-config" (OuterVolumeSpecName: "console-config") pod "ba3f396b-dc67-4c7a-8171-03acae476c79" (UID: "ba3f396b-dc67-4c7a-8171-03acae476c79"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:00:45.989950 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.989902 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ba3f396b-dc67-4c7a-8171-03acae476c79" (UID: "ba3f396b-dc67-4c7a-8171-03acae476c79"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:00:45.991416 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.991383 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ba3f396b-dc67-4c7a-8171-03acae476c79" (UID: "ba3f396b-dc67-4c7a-8171-03acae476c79"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:00:45.991595 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.991570 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3f396b-dc67-4c7a-8171-03acae476c79-kube-api-access-j9ng2" (OuterVolumeSpecName: "kube-api-access-j9ng2") pod "ba3f396b-dc67-4c7a-8171-03acae476c79" (UID: "ba3f396b-dc67-4c7a-8171-03acae476c79"). InnerVolumeSpecName "kube-api-access-j9ng2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:00:45.991745 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:45.991682 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ba3f396b-dc67-4c7a-8171-03acae476c79" (UID: "ba3f396b-dc67-4c7a-8171-03acae476c79"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:00:46.090956 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.090913 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-console-config\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:46.090956 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.090949 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-oauth-serving-cert\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:46.090956 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.090964 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j9ng2\" (UniqueName: \"kubernetes.io/projected/ba3f396b-dc67-4c7a-8171-03acae476c79-kube-api-access-j9ng2\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:46.091236 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.090978 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-serving-cert\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:46.091236 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.090989 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-service-ca\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:46.091236 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.091001 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3f396b-dc67-4c7a-8171-03acae476c79-console-oauth-config\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:46.091236 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.091015 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3f396b-dc67-4c7a-8171-03acae476c79-trusted-ca-bundle\") on node \"ip-10-0-128-248.ec2.internal\" DevicePath \"\"" Apr 17 21:00:46.625995 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.625965 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-676df7fdc7-s829p_ba3f396b-dc67-4c7a-8171-03acae476c79/console/0.log" Apr 17 21:00:46.626428 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.626006 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba3f396b-dc67-4c7a-8171-03acae476c79" containerID="a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0" exitCode=2 Apr 17 21:00:46.626428 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.626057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-676df7fdc7-s829p" event={"ID":"ba3f396b-dc67-4c7a-8171-03acae476c79","Type":"ContainerDied","Data":"a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0"} Apr 17 21:00:46.626428 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.626091 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-676df7fdc7-s829p" Apr 17 21:00:46.626428 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.626106 2576 scope.go:117] "RemoveContainer" containerID="a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0" Apr 17 21:00:46.626428 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.626094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-676df7fdc7-s829p" event={"ID":"ba3f396b-dc67-4c7a-8171-03acae476c79","Type":"ContainerDied","Data":"75d240729ef30cc6ea70f10c646129e8f3303a60691519acbab751f5770094d1"} Apr 17 21:00:46.634821 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.634803 2576 scope.go:117] "RemoveContainer" containerID="a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0" Apr 17 21:00:46.635106 ip-10-0-128-248 kubenswrapper[2576]: E0417 21:00:46.635082 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0\": container with ID starting with a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0 not found: ID does not exist" containerID="a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0" Apr 17 21:00:46.635208 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.635111 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0"} err="failed to get container status \"a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0\": rpc error: code = NotFound desc = could not find container \"a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0\": container with ID starting with a4d74f750d875474f81b7e49b2590d3a8fc1972cb07ec3209389cd085dee3aa0 not found: ID does not exist" Apr 17 21:00:46.652123 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.652096 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-676df7fdc7-s829p"] Apr 17 21:00:46.655935 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.655913 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-676df7fdc7-s829p"] Apr 17 21:00:46.789420 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.789317 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v"] Apr 17 21:00:46.789954 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.789920 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba3f396b-dc67-4c7a-8171-03acae476c79" containerName="console" Apr 17 21:00:46.789954 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.789951 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3f396b-dc67-4c7a-8171-03acae476c79" containerName="console" Apr 17 21:00:46.790173 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.789967 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c001474b-fef5-48f1-93dc-4225796b24ca" containerName="manager" Apr 17 21:00:46.790173 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.789977 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c001474b-fef5-48f1-93dc-4225796b24ca" containerName="manager" Apr 17 21:00:46.790173 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.790101 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c001474b-fef5-48f1-93dc-4225796b24ca" containerName="manager" Apr 17 21:00:46.790173 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.790119 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba3f396b-dc67-4c7a-8171-03acae476c79" containerName="console" Apr 17 21:00:46.795201 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.795176 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.798044 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.798002 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-nkcxc\"" Apr 17 21:00:46.806180 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.805322 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v"] Apr 17 21:00:46.898251 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.898215 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.898436 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.898271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.898436 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.898336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.898436 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.898365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.898436 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.898410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9d68\" (UniqueName: \"kubernetes.io/projected/ca407b3c-0d81-447d-8623-bc49da617a4c-kube-api-access-d9d68\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.898637 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.898447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.898637 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.898478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.898637 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.898528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ca407b3c-0d81-447d-8623-bc49da617a4c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.898637 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.898598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.999401 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.999357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9d68\" (UniqueName: \"kubernetes.io/projected/ca407b3c-0d81-447d-8623-bc49da617a4c-kube-api-access-d9d68\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.999578 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.999419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 
21:00:46.999578 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.999450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.999578 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.999492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ca407b3c-0d81-447d-8623-bc49da617a4c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.999578 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.999541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.999789 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.999598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.999789 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.999637 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.999900 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.999866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.999963 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.999919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.999963 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.999940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:46.999963 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:46.999953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-token\") pod 
\"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:47.000155 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:47.000070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:47.000255 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:47.000225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ca407b3c-0d81-447d-8623-bc49da617a4c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:47.000255 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:47.000237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:47.002062 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:47.002019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:47.002194 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:47.002175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:47.007411 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:47.007383 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ca407b3c-0d81-447d-8623-bc49da617a4c-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:47.007525 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:47.007498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9d68\" (UniqueName: \"kubernetes.io/projected/ca407b3c-0d81-447d-8623-bc49da617a4c-kube-api-access-d9d68\") pod \"maas-default-gateway-openshift-default-58b6f876-zr68v\" (UID: \"ca407b3c-0d81-447d-8623-bc49da617a4c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:47.110697 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:47.110653 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:47.252801 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:47.252764 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v"] Apr 17 21:00:47.254683 ip-10-0-128-248 kubenswrapper[2576]: W0417 21:00:47.254654 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca407b3c_0d81_447d_8623_bc49da617a4c.slice/crio-5c7d5c5455579be9a77e6bb27e53d2c61ebcf4f0bf511b261d8d2b790233c13a WatchSource:0}: Error finding container 5c7d5c5455579be9a77e6bb27e53d2c61ebcf4f0bf511b261d8d2b790233c13a: Status 404 returned error can't find the container with id 5c7d5c5455579be9a77e6bb27e53d2c61ebcf4f0bf511b261d8d2b790233c13a Apr 17 21:00:47.631935 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:47.631900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" event={"ID":"ca407b3c-0d81-447d-8623-bc49da617a4c","Type":"ContainerStarted","Data":"5c7d5c5455579be9a77e6bb27e53d2c61ebcf4f0bf511b261d8d2b790233c13a"} Apr 17 21:00:48.489587 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:48.489555 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3f396b-dc67-4c7a-8171-03acae476c79" path="/var/lib/kubelet/pods/ba3f396b-dc67-4c7a-8171-03acae476c79/volumes" Apr 17 21:00:49.728804 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:49.728760 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 21:00:49.729071 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:49.728849 2576 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 21:00:49.729071 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:49.728881 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 21:00:50.649111 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:50.649062 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" event={"ID":"ca407b3c-0d81-447d-8623-bc49da617a4c","Type":"ContainerStarted","Data":"83a467194683cea5f7585ab26b0f56b045f658d8e0a33523023f146156731945"} Apr 17 21:00:50.671262 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:50.671213 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" podStartSLOduration=2.199864994 podStartE2EDuration="4.671198726s" podCreationTimestamp="2026-04-17 21:00:46 +0000 UTC" firstStartedPulling="2026-04-17 21:00:47.257119226 +0000 UTC m=+609.263533236" lastFinishedPulling="2026-04-17 21:00:49.72845283 +0000 UTC m=+611.734866968" observedRunningTime="2026-04-17 21:00:50.668281996 +0000 UTC m=+612.674696052" watchObservedRunningTime="2026-04-17 21:00:50.671198726 +0000 UTC m=+612.677612759" Apr 17 21:00:51.110979 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.110948 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:51.116014 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.115985 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:51.565053 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.564992 2576 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-95d9k"] Apr 17 21:00:51.568873 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.568845 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" Apr 17 21:00:51.571446 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.571426 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jn4tk\"" Apr 17 21:00:51.571577 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.571430 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 21:00:51.579078 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.579055 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-95d9k"] Apr 17 21:00:51.589991 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.589965 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-95d9k"] Apr 17 21:00:51.645788 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.645759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ccea4105-9465-41d0-a87c-2d6e86ad9ff5-config-file\") pod \"limitador-limitador-78c99df468-95d9k\" (UID: \"ccea4105-9465-41d0-a87c-2d6e86ad9ff5\") " pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" Apr 17 21:00:51.645958 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.645790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwqt9\" (UniqueName: \"kubernetes.io/projected/ccea4105-9465-41d0-a87c-2d6e86ad9ff5-kube-api-access-cwqt9\") pod \"limitador-limitador-78c99df468-95d9k\" (UID: \"ccea4105-9465-41d0-a87c-2d6e86ad9ff5\") " 
pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" Apr 17 21:00:51.653098 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.653066 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:51.654003 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.653987 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-zr68v" Apr 17 21:00:51.747461 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.747427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ccea4105-9465-41d0-a87c-2d6e86ad9ff5-config-file\") pod \"limitador-limitador-78c99df468-95d9k\" (UID: \"ccea4105-9465-41d0-a87c-2d6e86ad9ff5\") " pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" Apr 17 21:00:51.747724 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.747693 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwqt9\" (UniqueName: \"kubernetes.io/projected/ccea4105-9465-41d0-a87c-2d6e86ad9ff5-kube-api-access-cwqt9\") pod \"limitador-limitador-78c99df468-95d9k\" (UID: \"ccea4105-9465-41d0-a87c-2d6e86ad9ff5\") " pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" Apr 17 21:00:51.748122 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.748090 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ccea4105-9465-41d0-a87c-2d6e86ad9ff5-config-file\") pod \"limitador-limitador-78c99df468-95d9k\" (UID: \"ccea4105-9465-41d0-a87c-2d6e86ad9ff5\") " pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" Apr 17 21:00:51.755729 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.755701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwqt9\" 
(UniqueName: \"kubernetes.io/projected/ccea4105-9465-41d0-a87c-2d6e86ad9ff5-kube-api-access-cwqt9\") pod \"limitador-limitador-78c99df468-95d9k\" (UID: \"ccea4105-9465-41d0-a87c-2d6e86ad9ff5\") " pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" Apr 17 21:00:51.880817 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:51.880745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" Apr 17 21:00:52.221420 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:52.221364 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-95d9k"] Apr 17 21:00:52.222700 ip-10-0-128-248 kubenswrapper[2576]: W0417 21:00:52.222672 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccea4105_9465_41d0_a87c_2d6e86ad9ff5.slice/crio-e486867358b6c1edce0359031d2f5fce1c9cddba245bd1b790991385b166bc43 WatchSource:0}: Error finding container e486867358b6c1edce0359031d2f5fce1c9cddba245bd1b790991385b166bc43: Status 404 returned error can't find the container with id e486867358b6c1edce0359031d2f5fce1c9cddba245bd1b790991385b166bc43 Apr 17 21:00:52.657526 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:52.657492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" event={"ID":"ccea4105-9465-41d0-a87c-2d6e86ad9ff5","Type":"ContainerStarted","Data":"e486867358b6c1edce0359031d2f5fce1c9cddba245bd1b790991385b166bc43"} Apr 17 21:00:55.673316 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:55.673281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" event={"ID":"ccea4105-9465-41d0-a87c-2d6e86ad9ff5","Type":"ContainerStarted","Data":"a598568f91fa8e1a07aff8e001645f345007a8dca8296a39ae3a1d6e3fc5597a"} Apr 17 21:00:55.673702 ip-10-0-128-248 kubenswrapper[2576]: I0417 
21:00:55.673359 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" Apr 17 21:00:55.691199 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:00:55.691152 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" podStartSLOduration=1.973847814 podStartE2EDuration="4.691138268s" podCreationTimestamp="2026-04-17 21:00:51 +0000 UTC" firstStartedPulling="2026-04-17 21:00:52.224412081 +0000 UTC m=+614.230826095" lastFinishedPulling="2026-04-17 21:00:54.941702519 +0000 UTC m=+616.948116549" observedRunningTime="2026-04-17 21:00:55.688133533 +0000 UTC m=+617.694547566" watchObservedRunningTime="2026-04-17 21:00:55.691138268 +0000 UTC m=+617.697552330" Apr 17 21:01:06.678144 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:01:06.678114 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-95d9k" Apr 17 21:01:56.600298 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:01:56.600261 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-95d9k"] Apr 17 21:02:21.601095 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:21.601051 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-95d9k"] Apr 17 21:02:38.386231 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:38.386187 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-95d9k"] Apr 17 21:02:42.995083 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:42.995043 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-95d9k"] Apr 17 21:02:43.375670 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.375637 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79"] Apr 17 21:02:43.378319 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.378294 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.382426 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.382392 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-lj54k\"" Apr 17 21:02:43.382426 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.382392 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 21:02:43.382614 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.382391 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 21:02:43.382614 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.382428 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 21:02:43.388572 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.388541 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79"] Apr 17 21:02:43.473036 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.472999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp2r6\" (UniqueName: \"kubernetes.io/projected/99946c83-ad99-44da-8448-2336453b0a2e-kube-api-access-fp2r6\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.473232 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.473080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.473232 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.473106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.473232 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.473140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.473345 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.473229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/99946c83-ad99-44da-8448-2336453b0a2e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.473345 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.473268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.574294 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.574251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fp2r6\" (UniqueName: \"kubernetes.io/projected/99946c83-ad99-44da-8448-2336453b0a2e-kube-api-access-fp2r6\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.574472 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.574337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.574472 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.574377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.574472 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.574431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.574631 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.574494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/99946c83-ad99-44da-8448-2336453b0a2e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.574631 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.574530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.574867 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.574842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.574867 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.574855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.574990 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.574892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.576788 ip-10-0-128-248 
kubenswrapper[2576]: I0417 21:02:43.576764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/99946c83-ad99-44da-8448-2336453b0a2e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.577149 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.577131 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/99946c83-ad99-44da-8448-2336453b0a2e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.582669 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.582643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp2r6\" (UniqueName: \"kubernetes.io/projected/99946c83-ad99-44da-8448-2336453b0a2e-kube-api-access-fp2r6\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79\" (UID: \"99946c83-ad99-44da-8448-2336453b0a2e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.689924 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.689827 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:43.827374 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.827342 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79"] Apr 17 21:02:43.829087 ip-10-0-128-248 kubenswrapper[2576]: W0417 21:02:43.829058 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99946c83_ad99_44da_8448_2336453b0a2e.slice/crio-777f7d980c55efb06307b46b4d779f24f57cfc46fddc2b9fc0b9278fa66dcb24 WatchSource:0}: Error finding container 777f7d980c55efb06307b46b4d779f24f57cfc46fddc2b9fc0b9278fa66dcb24: Status 404 returned error can't find the container with id 777f7d980c55efb06307b46b4d779f24f57cfc46fddc2b9fc0b9278fa66dcb24 Apr 17 21:02:43.836378 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:43.834522 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:02:44.121702 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:44.121669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" event={"ID":"99946c83-ad99-44da-8448-2336453b0a2e","Type":"ContainerStarted","Data":"777f7d980c55efb06307b46b4d779f24f57cfc46fddc2b9fc0b9278fa66dcb24"} Apr 17 21:02:46.598999 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:46.598940 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-95d9k"] Apr 17 21:02:50.163830 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:50.163792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" event={"ID":"99946c83-ad99-44da-8448-2336453b0a2e","Type":"ContainerStarted","Data":"dcf29d2da2389ed9bdd62d303f396fc1b016a56d40b46898330e78df9d15b136"} Apr 17 21:02:55.186402 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:55.186368 2576 
generic.go:358] "Generic (PLEG): container finished" podID="99946c83-ad99-44da-8448-2336453b0a2e" containerID="dcf29d2da2389ed9bdd62d303f396fc1b016a56d40b46898330e78df9d15b136" exitCode=0 Apr 17 21:02:55.186797 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:55.186441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" event={"ID":"99946c83-ad99-44da-8448-2336453b0a2e","Type":"ContainerDied","Data":"dcf29d2da2389ed9bdd62d303f396fc1b016a56d40b46898330e78df9d15b136"} Apr 17 21:02:57.196386 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:57.196347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" event={"ID":"99946c83-ad99-44da-8448-2336453b0a2e","Type":"ContainerStarted","Data":"4d6d2c1071de32238cb9c69263bd9b6301fee1a326e0b7e103fccc97079ee0d5"} Apr 17 21:02:57.196762 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:57.196576 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:02:57.215738 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:02:57.215684 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" podStartSLOduration=1.743210667 podStartE2EDuration="14.215661627s" podCreationTimestamp="2026-04-17 21:02:43 +0000 UTC" firstStartedPulling="2026-04-17 21:02:43.834708008 +0000 UTC m=+725.841122034" lastFinishedPulling="2026-04-17 21:02:56.307158983 +0000 UTC m=+738.313572994" observedRunningTime="2026-04-17 21:02:57.213200038 +0000 UTC m=+739.219614072" watchObservedRunningTime="2026-04-17 21:02:57.215661627 +0000 UTC m=+739.222075659" Apr 17 21:03:00.685507 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:03:00.685455 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-95d9k"] Apr 17 21:03:08.213923 ip-10-0-128-248 
kubenswrapper[2576]: I0417 21:03:08.213892 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79" Apr 17 21:03:22.792448 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:03:22.792408 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-95d9k"] Apr 17 21:05:38.501170 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:05:38.501140 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log" Apr 17 21:05:38.501671 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:05:38.501434 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log" Apr 17 21:06:11.687408 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:11.687374 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-kzt7p_c20d1b53-d6f2-43e3-beaa-9ce52cc6af90/manager/0.log" Apr 17 21:06:12.022012 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:12.021931 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-cx8hg_9065c47d-482b-4274-9e3c-f9dfa8b506fb/manager/2.log" Apr 17 21:06:12.130922 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:12.130890 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq_c32c12ad-6174-4995-b1ee-e43255b206ee/manager/0.log" Apr 17 21:06:13.161663 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.161633 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf_2c95d737-42c4-415a-a9a8-a372c39f21ad/util/0.log" Apr 17 21:06:13.167339 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.167316 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf_2c95d737-42c4-415a-a9a8-a372c39f21ad/pull/0.log" Apr 17 21:06:13.172701 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.172674 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf_2c95d737-42c4-415a-a9a8-a372c39f21ad/extract/0.log" Apr 17 21:06:13.272340 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.272308 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b_4b53a019-ec4b-4dcd-ada4-3f42b65329c9/pull/0.log" Apr 17 21:06:13.277742 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.277720 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b_4b53a019-ec4b-4dcd-ada4-3f42b65329c9/extract/0.log" Apr 17 21:06:13.283352 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.283333 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b_4b53a019-ec4b-4dcd-ada4-3f42b65329c9/util/0.log" Apr 17 21:06:13.385899 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.385868 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm_2a0db55c-8f6a-4ac6-903f-dd3250e6f435/util/0.log" Apr 17 21:06:13.392636 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.392610 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm_2a0db55c-8f6a-4ac6-903f-dd3250e6f435/pull/0.log" Apr 17 21:06:13.397949 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.397931 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm_2a0db55c-8f6a-4ac6-903f-dd3250e6f435/extract/0.log" Apr 17 21:06:13.503139 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.503064 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2_03211626-ffc0-4a5b-9cd7-d55efdc86cde/util/0.log" Apr 17 21:06:13.508529 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.508509 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2_03211626-ffc0-4a5b-9cd7-d55efdc86cde/pull/0.log" Apr 17 21:06:13.513536 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:13.513516 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2_03211626-ffc0-4a5b-9cd7-d55efdc86cde/extract/0.log" Apr 17 21:06:14.156530 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:14.156491 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-khlfh_58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2/manager/0.log" Apr 17 21:06:14.260895 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:14.260868 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-95d9k_ccea4105-9465-41d0-a87c-2d6e86ad9ff5/limitador/0.log" Apr 17 21:06:14.801370 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:14.801339 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-hc7bh_87f053f8-a963-4596-83f5-904087b875bb/discovery/0.log" Apr 17 21:06:14.904594 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:14.904511 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-666889b9b6-dwtcj_43070034-fd46-413e-ac75-0b27ae25d08a/kube-auth-proxy/0.log" Apr 17 21:06:15.111612 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:15.111578 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-zr68v_ca407b3c-0d81-447d-8623-bc49da617a4c/istio-proxy/0.log" Apr 17 21:06:15.665252 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:15.665227 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79_99946c83-ad99-44da-8448-2336453b0a2e/storage-initializer/0.log" Apr 17 21:06:15.672145 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:15.672121 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-6xz79_99946c83-ad99-44da-8448-2336453b0a2e/main/0.log" Apr 17 21:06:22.561412 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:22.561380 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5ws25_1690cc02-1c37-4fea-8728-f08db138a40f/global-pull-secret-syncer/0.log" Apr 17 21:06:22.715506 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:22.715474 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-g2jbd_216f383a-4c2f-482f-b8e6-c36230d9ffbf/konnectivity-agent/0.log" Apr 17 21:06:22.761594 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:22.761566 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-248.ec2.internal_d579a59ca2c3bf4b3f744c41961ff1e1/haproxy/0.log" Apr 17 21:06:26.251420 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.251389 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf_2c95d737-42c4-415a-a9a8-a372c39f21ad/extract/0.log" Apr 17 21:06:26.272302 ip-10-0-128-248 
kubenswrapper[2576]: I0417 21:06:26.272279 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf_2c95d737-42c4-415a-a9a8-a372c39f21ad/util/0.log" Apr 17 21:06:26.315768 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.315740 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759zggmf_2c95d737-42c4-415a-a9a8-a372c39f21ad/pull/0.log" Apr 17 21:06:26.349794 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.349765 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b_4b53a019-ec4b-4dcd-ada4-3f42b65329c9/extract/0.log" Apr 17 21:06:26.373858 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.373832 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b_4b53a019-ec4b-4dcd-ada4-3f42b65329c9/util/0.log" Apr 17 21:06:26.397401 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.397373 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05ps9b_4b53a019-ec4b-4dcd-ada4-3f42b65329c9/pull/0.log" Apr 17 21:06:26.431377 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.431353 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm_2a0db55c-8f6a-4ac6-903f-dd3250e6f435/extract/0.log" Apr 17 21:06:26.456390 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.456370 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm_2a0db55c-8f6a-4ac6-903f-dd3250e6f435/util/0.log" Apr 17 21:06:26.477577 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.477558 2576 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73swbsm_2a0db55c-8f6a-4ac6-903f-dd3250e6f435/pull/0.log" Apr 17 21:06:26.505753 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.505698 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2_03211626-ffc0-4a5b-9cd7-d55efdc86cde/extract/0.log" Apr 17 21:06:26.529418 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.529390 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2_03211626-ffc0-4a5b-9cd7-d55efdc86cde/util/0.log" Apr 17 21:06:26.553941 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.553920 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zqlx2_03211626-ffc0-4a5b-9cd7-d55efdc86cde/pull/0.log" Apr 17 21:06:26.743632 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.743604 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-khlfh_58ae72d8-1ff3-45fb-8f4e-6d4afee8c6d2/manager/0.log" Apr 17 21:06:26.768354 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:26.768298 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-95d9k_ccea4105-9465-41d0-a87c-2d6e86ad9ff5/limitador/0.log" Apr 17 21:06:28.040122 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.040095 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d64d740e-c62f-415b-9506-0ddd2f3a03c9/alertmanager/0.log" Apr 17 21:06:28.064344 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.064320 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d64d740e-c62f-415b-9506-0ddd2f3a03c9/config-reloader/0.log" Apr 17 21:06:28.086056 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.086015 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d64d740e-c62f-415b-9506-0ddd2f3a03c9/kube-rbac-proxy-web/0.log" Apr 17 21:06:28.107460 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.107439 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d64d740e-c62f-415b-9506-0ddd2f3a03c9/kube-rbac-proxy/0.log" Apr 17 21:06:28.129182 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.129159 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d64d740e-c62f-415b-9506-0ddd2f3a03c9/kube-rbac-proxy-metric/0.log" Apr 17 21:06:28.153008 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.152983 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d64d740e-c62f-415b-9506-0ddd2f3a03c9/prom-label-proxy/0.log" Apr 17 21:06:28.173862 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.173823 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d64d740e-c62f-415b-9506-0ddd2f3a03c9/init-config-reloader/0.log" Apr 17 21:06:28.235443 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.235415 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4xl6v_1049aead-593a-4542-8f3d-64e8b6e5868c/kube-state-metrics/0.log" Apr 17 21:06:28.258464 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.258420 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4xl6v_1049aead-593a-4542-8f3d-64e8b6e5868c/kube-rbac-proxy-main/0.log" Apr 17 21:06:28.281969 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.281944 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4xl6v_1049aead-593a-4542-8f3d-64e8b6e5868c/kube-rbac-proxy-self/0.log" Apr 17 21:06:28.354044 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.353997 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-s89br_e0274df8-8f37-41d5-9ac8-89ed0c37fdd0/monitoring-plugin/0.log" Apr 17 21:06:28.378702 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.378678 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-98rrz_80324393-b669-4bf0-9373-8962675c7e34/node-exporter/0.log" Apr 17 21:06:28.404784 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.404755 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-98rrz_80324393-b669-4bf0-9373-8962675c7e34/kube-rbac-proxy/0.log" Apr 17 21:06:28.432556 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.432536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-98rrz_80324393-b669-4bf0-9373-8962675c7e34/init-textfile/0.log" Apr 17 21:06:28.630291 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.630215 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xpj77_3366da27-91fd-4f57-9f19-7f1a7aca9a23/kube-rbac-proxy-main/0.log" Apr 17 21:06:28.652631 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.652605 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xpj77_3366da27-91fd-4f57-9f19-7f1a7aca9a23/kube-rbac-proxy-self/0.log" Apr 17 21:06:28.673596 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.673570 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xpj77_3366da27-91fd-4f57-9f19-7f1a7aca9a23/openshift-state-metrics/0.log" Apr 17 21:06:28.938245 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.938169 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-qwgks_6113635f-b0f8-443f-935b-a31f49f907ad/prometheus-operator-admission-webhook/0.log" Apr 17 21:06:28.975240 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.975214 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-69d7584d88-zt59r_8467193a-45b0-480d-b09a-7583a0813a24/telemeter-client/0.log" Apr 17 21:06:28.999193 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:28.999169 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-69d7584d88-zt59r_8467193a-45b0-480d-b09a-7583a0813a24/reload/0.log" Apr 17 21:06:29.021994 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:29.021975 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-69d7584d88-zt59r_8467193a-45b0-480d-b09a-7583a0813a24/kube-rbac-proxy/0.log" Apr 17 21:06:31.345634 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.345604 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7ff664dfb6-lcc5q_422bf273-28f9-489e-8bfb-0834a4d90d5b/console/0.log" Apr 17 21:06:31.371127 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.371106 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-smxnr_fad3a57b-d1f9-4920-971e-1e1bdf88df8f/download-server/0.log" Apr 17 21:06:31.410462 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.410432 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd"] Apr 17 21:06:31.414191 ip-10-0-128-248 kubenswrapper[2576]: I0417 
21:06:31.414165 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.417135 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.416954 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mxv2s\"/\"kube-root-ca.crt\"" Apr 17 21:06:31.417265 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.417148 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mxv2s\"/\"openshift-service-ca.crt\"" Apr 17 21:06:31.417265 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.417252 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mxv2s\"/\"default-dockercfg-kt5z7\"" Apr 17 21:06:31.422571 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.422485 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd"] Apr 17 21:06:31.483185 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.483160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-proc\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.483308 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.483229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-lib-modules\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.483308 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.483248 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-sys\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.483308 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.483282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-podres\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.483416 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.483332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlk6v\" (UniqueName: \"kubernetes.io/projected/db383b70-93e5-4ab2-9120-59b9c3d1d386-kube-api-access-jlk6v\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.584175 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.584145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-lib-modules\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.584175 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.584179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-sys\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: 
\"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.584377 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.584219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-podres\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.584377 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.584239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlk6v\" (UniqueName: \"kubernetes.io/projected/db383b70-93e5-4ab2-9120-59b9c3d1d386-kube-api-access-jlk6v\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.584377 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.584273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-proc\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.584377 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.584319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-lib-modules\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.584377 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.584370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-podres\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.584557 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.584372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-sys\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.584557 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.584438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/db383b70-93e5-4ab2-9120-59b9c3d1d386-proc\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.591820 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.591798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlk6v\" (UniqueName: \"kubernetes.io/projected/db383b70-93e5-4ab2-9120-59b9c3d1d386-kube-api-access-jlk6v\") pod \"perf-node-gather-daemonset-rmlnd\" (UID: \"db383b70-93e5-4ab2-9120-59b9c3d1d386\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.726242 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.726160 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:31.853587 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:31.853553 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd"] Apr 17 21:06:31.854637 ip-10-0-128-248 kubenswrapper[2576]: W0417 21:06:31.854608 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddb383b70_93e5_4ab2_9120_59b9c3d1d386.slice/crio-799024233a367de83a00f72ca8029093dd648c5bbf6b87f60985423ffab42d3f WatchSource:0}: Error finding container 799024233a367de83a00f72ca8029093dd648c5bbf6b87f60985423ffab42d3f: Status 404 returned error can't find the container with id 799024233a367de83a00f72ca8029093dd648c5bbf6b87f60985423ffab42d3f Apr 17 21:06:32.118695 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:32.118663 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" event={"ID":"db383b70-93e5-4ab2-9120-59b9c3d1d386","Type":"ContainerStarted","Data":"62384ec5b7ce48b0fdd7e93808eb081754bac74f909279f1f7260860681d25f5"} Apr 17 21:06:32.118874 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:32.118702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" event={"ID":"db383b70-93e5-4ab2-9120-59b9c3d1d386","Type":"ContainerStarted","Data":"799024233a367de83a00f72ca8029093dd648c5bbf6b87f60985423ffab42d3f"} Apr 17 21:06:32.118874 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:32.118741 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" Apr 17 21:06:32.134418 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:32.134380 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd" 
podStartSLOduration=1.134367441 podStartE2EDuration="1.134367441s" podCreationTimestamp="2026-04-17 21:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:06:32.133526056 +0000 UTC m=+954.139940090" watchObservedRunningTime="2026-04-17 21:06:32.134367441 +0000 UTC m=+954.140781474" Apr 17 21:06:32.850629 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:32.850592 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ws9kw_d2a7b59a-da53-4886-af79-da4e37b76a3e/dns/0.log" Apr 17 21:06:32.870336 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:32.870314 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ws9kw_d2a7b59a-da53-4886-af79-da4e37b76a3e/kube-rbac-proxy/0.log" Apr 17 21:06:32.918573 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:32.918545 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bv5tl_bfe609fc-0a8b-47f5-ac0d-e7e0c47917de/dns-node-resolver/0.log" Apr 17 21:06:33.406995 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:33.406962 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-66c59b6986-j28rw_f793a52c-6af4-4836-bb9d-78c25e56a7ed/registry/0.log" Apr 17 21:06:33.473645 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:33.473614 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-q6mlz_bd8e668b-8da4-40de-9724-891b41810e2d/node-ca/0.log" Apr 17 21:06:34.328527 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:34.328491 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-hc7bh_87f053f8-a963-4596-83f5-904087b875bb/discovery/0.log" Apr 17 21:06:34.346322 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:34.346296 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-666889b9b6-dwtcj_43070034-fd46-413e-ac75-0b27ae25d08a/kube-auth-proxy/0.log"
Apr 17 21:06:34.392516 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:34.392489 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-zr68v_ca407b3c-0d81-447d-8623-bc49da617a4c/istio-proxy/0.log"
Apr 17 21:06:34.944370 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:34.944341 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wsdk8_be99f611-577c-4ca5-8103-64206cf5c3bd/serve-healthcheck-canary/0.log"
Apr 17 21:06:35.347596 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:35.347561 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-wf7nr_6ddc2491-03fd-4683-9633-ee15d1c98c2d/insights-operator/1.log"
Apr 17 21:06:35.347976 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:35.347717 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-wf7nr_6ddc2491-03fd-4683-9633-ee15d1c98c2d/insights-operator/0.log"
Apr 17 21:06:35.370655 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:35.370630 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-29mv6_1b4762c6-58a6-4aa4-83cd-d9b76247b1e7/kube-rbac-proxy/0.log"
Apr 17 21:06:35.394406 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:35.394379 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-29mv6_1b4762c6-58a6-4aa4-83cd-d9b76247b1e7/exporter/0.log"
Apr 17 21:06:35.416207 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:35.416180 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-29mv6_1b4762c6-58a6-4aa4-83cd-d9b76247b1e7/extractor/0.log"
Apr 17 21:06:37.333361 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:37.333335 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-kzt7p_c20d1b53-d6f2-43e3-beaa-9ce52cc6af90/manager/0.log"
Apr 17 21:06:37.405325 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:37.405299 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-cx8hg_9065c47d-482b-4274-9e3c-f9dfa8b506fb/manager/1.log"
Apr 17 21:06:37.415445 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:37.415426 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-cx8hg_9065c47d-482b-4274-9e3c-f9dfa8b506fb/manager/2.log"
Apr 17 21:06:37.435088 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:37.435065 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f74b9c8f9-9t7sq_c32c12ad-6174-4995-b1ee-e43255b206ee/manager/0.log"
Apr 17 21:06:38.133194 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:38.133157 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-rmlnd"
Apr 17 21:06:38.607135 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:38.607102 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-wz56g_51bedda0-02fb-443a-bd4f-e9016f008085/openshift-lws-operator/0.log"
Apr 17 21:06:42.906633 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:42.906603 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dhjql_9d15f35d-a2cc-4916-9a12-7d0f34fdfe67/migrator/0.log"
Apr 17 21:06:42.930663 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:42.930619 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dhjql_9d15f35d-a2cc-4916-9a12-7d0f34fdfe67/graceful-termination/0.log"
Apr 17 21:06:43.272289 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:43.272258 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gzcjp_7e1e5242-44ff-4464-9b66-32c92d278373/kube-storage-version-migrator-operator/1.log"
Apr 17 21:06:43.273099 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:43.273079 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gzcjp_7e1e5242-44ff-4464-9b66-32c92d278373/kube-storage-version-migrator-operator/0.log"
Apr 17 21:06:44.276633 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:44.276604 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzs8v_f9a06ede-5ace-4172-9e77-3eb6558ce0b1/kube-multus-additional-cni-plugins/0.log"
Apr 17 21:06:44.298142 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:44.298116 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzs8v_f9a06ede-5ace-4172-9e77-3eb6558ce0b1/egress-router-binary-copy/0.log"
Apr 17 21:06:44.318263 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:44.318241 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzs8v_f9a06ede-5ace-4172-9e77-3eb6558ce0b1/cni-plugins/0.log"
Apr 17 21:06:44.338788 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:44.338760 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzs8v_f9a06ede-5ace-4172-9e77-3eb6558ce0b1/bond-cni-plugin/0.log"
Apr 17 21:06:44.360487 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:44.360468 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzs8v_f9a06ede-5ace-4172-9e77-3eb6558ce0b1/routeoverride-cni/0.log"
Apr 17 21:06:44.380369 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:44.380346 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzs8v_f9a06ede-5ace-4172-9e77-3eb6558ce0b1/whereabouts-cni-bincopy/0.log"
Apr 17 21:06:44.400589 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:44.400565 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzs8v_f9a06ede-5ace-4172-9e77-3eb6558ce0b1/whereabouts-cni/0.log"
Apr 17 21:06:44.782186 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:44.782156 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ckt8c_16f9d82e-a351-4b70-9872-d4e4ed6dd1b2/kube-multus/0.log"
Apr 17 21:06:44.892854 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:44.892822 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z2lfh_f35870d4-a814-46cd-8764-30e79f3a7d79/network-metrics-daemon/0.log"
Apr 17 21:06:44.911384 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:44.911357 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z2lfh_f35870d4-a814-46cd-8764-30e79f3a7d79/kube-rbac-proxy/0.log"
Apr 17 21:06:46.282371 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:46.282329 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-controller/0.log"
Apr 17 21:06:46.299367 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:46.299339 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/0.log"
Apr 17 21:06:46.303417 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:46.303396 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovn-acl-logging/1.log"
Apr 17 21:06:46.322198 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:46.322175 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/kube-rbac-proxy-node/0.log"
Apr 17 21:06:46.341230 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:46.341208 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 21:06:46.358610 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:46.358585 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/northd/0.log"
Apr 17 21:06:46.377752 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:46.377728 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/nbdb/0.log"
Apr 17 21:06:46.397749 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:46.397728 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/sbdb/0.log"
Apr 17 21:06:46.502522 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:46.502486 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcjq_c5da4be1-c2a5-411b-a362-4b558f88d556/ovnkube-controller/0.log"
Apr 17 21:06:47.596609 ip-10-0-128-248 kubenswrapper[2576]: I0417 21:06:47.596583 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hzbw9_276cca8f-28b9-4e8c-918c-32c78b8870e7/network-check-target-container/0.log"