Apr 19 15:25:00.761000 ip-10-0-133-218 systemd[1]: Starting Kubernetes Kubelet...
Apr 19 15:25:01.261022 ip-10-0-133-218 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 15:25:01.261022 ip-10-0-133-218 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 19 15:25:01.261022 ip-10-0-133-218 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 15:25:01.261022 ip-10-0-133-218 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 19 15:25:01.261022 ip-10-0-133-218 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
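The deprecation warnings above ask for these flags to be carried in the kubelet config file rather than on the command line. A minimal sketch of the equivalent `KubeletConfiguration` stanza (field names are from the `kubelet.config.k8s.io/v1beta1` API; the values below are illustrative assumptions, not read from this node):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: /var/run/crio/crio.sock
# replaces --volume-plugin-dir (path is an assumed example)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is replaced by eviction thresholds
evictionHard:
  memory.available: 100Mi
```

The file is then pointed at via the kubelet's `--config` flag, as the warning text suggests.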
Apr 19 15:25:01.264862 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.264773 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 19 15:25:01.268142 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268120 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 15:25:01.268142 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268137 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 15:25:01.268142 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268141 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 15:25:01.268142 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268144 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 15:25:01.268142 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268148 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268151 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268156 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268159 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268163 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268166 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268169 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268173 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268179 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268182 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268185 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268188 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268202 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268205 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268208 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268211 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268214 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268217 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268220 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268222 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 15:25:01.268348 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268225 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268231 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268235 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268239 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268242 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268245 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268248 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268250 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268253 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268256 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268260 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268266 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268273 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268276 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268280 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268284 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268287 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268291 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268294 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 15:25:01.268986 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268298 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268300 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268304 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268307 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268310 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268313 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268319 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268323 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268326 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268329 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268332 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268335 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268337 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268340 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268343 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268346 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268349 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268352 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268357 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268360 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 15:25:01.269489 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268362 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268534 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268551 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268555 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268558 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268562 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268565 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268569 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268571 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268575 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268578 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268581 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268584 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268587 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268589 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268592 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268595 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268598 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268601 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268603 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 15:25:01.269999 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268606 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268608 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.268612 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269691 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269698 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269702 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269705 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269707 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269710 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269713 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269733 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269738 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269741 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269744 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269747 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269750 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269753 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269756 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269759 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 15:25:01.270515 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269761 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269764 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269767 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269770 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269772 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269775 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269777 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269780 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269782 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269785 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269787 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269790 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269792 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269795 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269798 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269801 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269804 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269807 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269809 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269819 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 15:25:01.271002 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269823 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269826 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269828 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269831 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269834 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269836 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269838 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269843 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269846 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269848 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269851 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269853 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269856 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269858 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269861 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269863 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269866 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269868 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269872 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269874 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 15:25:01.271487 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269878 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269884 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269887 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269890 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269892 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269895 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269897 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269900 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269902 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269905 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269907 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269911 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269913 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269915 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269918 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269920 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269923 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269925 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269928 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269931 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 15:25:01.271993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269934 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269937 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269940 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269943 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269945 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269947 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269950 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269952 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269955 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.269958 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270040 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270047 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270055 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270059 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270065 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270068 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270073 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270077 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270080 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270083 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270087 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270090 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 19 15:25:01.272479 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270094 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270096 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270099 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270102 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270105 2579 flags.go:64] FLAG: --cloud-config=""
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270107 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270111 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270115 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270117 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270121 2579 flags.go:64] FLAG: --config-dir=""
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270125 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270128 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270132 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270135 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270138 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270142 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270145 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270148 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270151 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270154 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270157 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270161 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270164 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270167 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270169 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 19 15:25:01.273051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270173 2579 flags.go:64] FLAG: --enable-server="true"
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270176 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270181 2579 flags.go:64] FLAG: --event-burst="100"
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270184 2579 flags.go:64] FLAG: --event-qps="50"
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270186 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270190 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270192 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270196 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270199 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270202 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270205 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270208 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270211 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270214 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270216 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270219 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr
19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270223 2579 flags.go:64] FLAG: --fail-swap-on="true" Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270226 2579 flags.go:64] FLAG: --feature-gates="" Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270230 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270233 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270236 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270239 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270242 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270245 2579 flags.go:64] FLAG: --help="false" Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270248 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270251 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270254 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270257 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270260 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270263 2579 flags.go:64] 
FLAG: --image-gc-high-threshold="85" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270266 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270269 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270271 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270275 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270278 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270281 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270283 2579 flags.go:64] FLAG: --kube-reserved="" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270286 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270289 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270292 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270294 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270297 2579 flags.go:64] FLAG: --lock-file="" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270300 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270303 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 19 15:25:01.274437 ip-10-0-133-218 
kubenswrapper[2579]: I0419 15:25:01.270305 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270315 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270318 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270321 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 19 15:25:01.274437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270325 2579 flags.go:64] FLAG: --logging-format="text" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270328 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270331 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270334 2579 flags.go:64] FLAG: --manifest-url="" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270337 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270341 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270344 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270348 2579 flags.go:64] FLAG: --max-pods="110" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270351 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270354 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270357 2579 flags.go:64] FLAG: 
--memory-manager-policy="None" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270359 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270363 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270365 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270368 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270376 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270379 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270382 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270385 2579 flags.go:64] FLAG: --pod-cidr="" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270389 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270394 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270396 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270399 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270402 2579 flags.go:64] FLAG: --port="10250" Apr 19 15:25:01.275022 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270405 2579 
flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270409 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07739db662c755251" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270412 2579 flags.go:64] FLAG: --qos-reserved="" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270415 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270417 2579 flags.go:64] FLAG: --register-node="true" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270421 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270423 2579 flags.go:64] FLAG: --register-with-taints="" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270427 2579 flags.go:64] FLAG: --registry-burst="10" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270430 2579 flags.go:64] FLAG: --registry-qps="5" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270437 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270440 2579 flags.go:64] FLAG: --reserved-memory="" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270444 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270447 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270450 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270452 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:25:01.270455 2579 flags.go:64] FLAG: --runonce="false" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270458 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270461 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270464 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270466 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270469 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270472 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270476 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270479 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270481 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270484 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 19 15:25:01.275600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270487 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270490 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270493 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270496 2579 flags.go:64] FLAG: 
--system-cgroups="" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270499 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270504 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270507 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270510 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270513 2579 flags.go:64] FLAG: --tls-min-version="" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270516 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270519 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270522 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270525 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270528 2579 flags.go:64] FLAG: --v="2" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270533 2579 flags.go:64] FLAG: --version="false" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270538 2579 flags.go:64] FLAG: --vmodule="" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270542 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.270545 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: W0419 
15:25:01.270638 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270642 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270646 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270649 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270652 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 19 15:25:01.276275 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270655 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270659 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270662 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270665 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270667 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270670 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270672 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270674 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270677 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270679 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270681 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270684 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270687 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270689 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270691 2579 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270694 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270696 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270699 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270701 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 19 15:25:01.276896 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270704 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270706 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270708 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270711 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270713 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270730 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270733 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270736 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 19 15:25:01.277420 
ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270738 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270741 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270743 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270746 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270749 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270751 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270754 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270757 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270759 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270762 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270764 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 15:25:01.277420 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270767 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 
15:25:01.270769 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270772 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270774 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270777 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270779 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270782 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270785 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270788 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270790 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270792 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270795 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270797 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270800 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 15:25:01.277931 ip-10-0-133-218 
kubenswrapper[2579]: W0419 15:25:01.270802 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270805 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270807 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270811 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270814 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270816 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 15:25:01.277931 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270819 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270821 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270824 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270826 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270829 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270831 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270834 2579 feature_gate.go:328] unrecognized feature gate: 
BuildCSIVolumes
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270836 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270838 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270841 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270843 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270846 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270848 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270850 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270853 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270855 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270858 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270860 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270863 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270865 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 15:25:01.278446 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270868 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270870 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.270873 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.271776 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.278379 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.278394 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278441 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278446 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278449 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278452 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278456 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278459 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278461 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278464 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278467 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278469 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 15:25:01.278964 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278472 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278475 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278477 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278480 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278482 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278485 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278487 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278490 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278492 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278495 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278497 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278500 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278502 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278505 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278507 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278509 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278512 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278514 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278517 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278519 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 15:25:01.279353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278521 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278525 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278528 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278531 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278533 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278536 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278538 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278540 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278543 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278545 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278548 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278550 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278553 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278555 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278558 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278560 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278562 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278565 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278567 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278570 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 15:25:01.279859 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278572 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278575 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278578 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278580 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278582 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278585 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278587 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278589 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278592 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278594 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278597 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278599 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278601 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278605 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278608 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278611 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278614 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278616 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278620 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 15:25:01.280342 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278623 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278625 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278628 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278630 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278633 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278635 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278638 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278640 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278643 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278645 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278647 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278650 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278652 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278655 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278659 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278662 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 15:25:01.280823 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278665 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.278670 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278784 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278790 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278793 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278796 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278799 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278802 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278804 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278807 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278810 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278813 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278816 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278819 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278821 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278824 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 15:25:01.281216 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278826 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278829 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278831 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278834 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278836 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278838 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278841 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278844 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278846 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278848 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278851 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278853 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278856 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278858 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278861 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278863 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278865 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278868 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278870 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 15:25:01.281600 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278872 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278875 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278877 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278880 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278882 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278884 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278888 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278892 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278896 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278899 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278902 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278905 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278908 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278911 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278913 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278916 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278918 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278921 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278924 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 15:25:01.282168 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278926 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278929 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278931 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278934 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278936 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278938 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278941 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278943 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278946 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278948 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278951 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278953 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278955 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278958 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278960 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278962 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278965 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278967 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278970 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278973 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 15:25:01.282629 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278976 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278978 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278981 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278983 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278986 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278989 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278991 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278993 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278996 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.278999 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.279002 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.279005 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.279007 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:01.279009 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.279014 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 15:25:01.283137 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.279733 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 19 15:25:01.284054 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.284040 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 19 15:25:01.285169 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.285156 2579 server.go:1019] "Starting client certificate rotation"
Apr 19 15:25:01.285275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.285258 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 19 15:25:01.285309 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.285303 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 19 15:25:01.314811 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.314785 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 19 15:25:01.317424 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.317406 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 19 15:25:01.339678 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.339647 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 19 15:25:01.346094 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.346070 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 19 15:25:01.346912 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.346898 2579 log.go:25] "Validated CRI v1 image API"
Apr 19 15:25:01.348343 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.348323 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 19 15:25:01.351666 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.351645 2579 fs.go:135] Filesystem UUIDs: map[32f662e8-a59d-4553-883f-4680ac0efef7:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 c358b993-78a9-4cf0-bfa9-e12769346e62:/dev/nvme0n1p4]
Apr 19 15:25:01.351732 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.351666 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 19 15:25:01.358523 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.358406 2579 manager.go:217] Machine: {Timestamp:2026-04-19 15:25:01.356092063 +0000 UTC m=+0.462007443 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103631 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2734687cbf7901315fc851b3d05596 SystemUUID:ec273468-7cbf-7901-315f-c851b3d05596 BootID:6598961c-8558-4f4d-9cf5-b2d81629e945 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4e:19:e0:ec:0d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4e:19:e0:ec:0d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0a:60:4e:b6:34:df Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 19 15:25:01.358523 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.358517 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 19 15:25:01.358679 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.358666 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 19 15:25:01.360802 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.360770 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 19 15:25:01.360956 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.360803 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-218.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 19 15:25:01.360999 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.360966 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 19 15:25:01.360999 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.360975 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 19 15:25:01.360999 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.360989 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 19 15:25:01.361075 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.361003 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 19 15:25:01.362502 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.362492 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 19 15:25:01.362613 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.362604 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 19 15:25:01.364250 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.364233 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5j7d8" Apr 19 15:25:01.368810 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.368795 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 19 15:25:01.368865 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.368816 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 19 15:25:01.368865 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.368831 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 19 15:25:01.368865 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.368841 2579 kubelet.go:397] "Adding apiserver pod source" Apr 19 15:25:01.368865 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.368859 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 19 15:25:01.370263 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.370246 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 19 15:25:01.370342 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.370269 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 19 15:25:01.371478 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.371452 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5j7d8" Apr 19 15:25:01.373434 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.373416 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 19 15:25:01.374901 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.374888 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 19 15:25:01.376877 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376864 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 19 15:25:01.376942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376881 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 19 15:25:01.376942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376887 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 19 15:25:01.376942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376892 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 19 15:25:01.376942 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376897 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 19 15:25:01.376942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376903 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 19 15:25:01.376942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376910 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 19 15:25:01.376942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376916 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 19 15:25:01.376942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376923 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 19 15:25:01.376942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376930 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 19 15:25:01.376942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376938 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 19 15:25:01.376942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.376947 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 19 15:25:01.377917 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.377904 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 19 15:25:01.377950 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.377919 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 19 15:25:01.381893 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.381871 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 19 15:25:01.382017 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.381955 2579 server.go:1295] "Started kubelet" Apr 19 15:25:01.382532 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.382481 2579 ratelimit.go:55] "Setting rate limiting for endpoint" 
service="podresources" qps=100 burstTokens=10 Apr 19 15:25:01.382608 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.382526 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 19 15:25:01.382663 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.382612 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 19 15:25:01.383192 ip-10-0-133-218 systemd[1]: Started Kubernetes Kubelet. Apr 19 15:25:01.384267 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.384230 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 19 15:25:01.386947 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.386926 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 15:25:01.387155 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.387141 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 19 15:25:01.389926 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.389892 2579 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-218.ec2.internal" not found Apr 19 15:25:01.390181 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.390160 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 15:25:01.393374 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.393354 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 19 15:25:01.393871 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.393852 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 19 15:25:01.394602 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.394584 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 19 15:25:01.394602 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.394602 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 19 
15:25:01.395331 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.394750 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 19 15:25:01.395331 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.394910 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 19 15:25:01.395331 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.394919 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 19 15:25:01.395331 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:01.394933 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-218.ec2.internal\" not found" Apr 19 15:25:01.395331 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:01.395131 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 19 15:25:01.395612 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.395418 2579 factory.go:55] Registering systemd factory Apr 19 15:25:01.395612 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.395435 2579 factory.go:223] Registration of the systemd container factory successfully Apr 19 15:25:01.395774 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.395624 2579 factory.go:153] Registering CRI-O factory Apr 19 15:25:01.395774 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.395633 2579 factory.go:223] Registration of the crio container factory successfully Apr 19 15:25:01.395774 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.395679 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 19 15:25:01.395774 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.395702 2579 factory.go:103] Registering Raw factory Apr 19 15:25:01.395774 ip-10-0-133-218 
kubenswrapper[2579]: I0419 15:25:01.395738 2579 manager.go:1196] Started watching for new ooms in manager Apr 19 15:25:01.396044 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.395926 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 15:25:01.397025 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.397006 2579 manager.go:319] Starting recovery of all containers Apr 19 15:25:01.401915 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:01.401892 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-218.ec2.internal\" not found" node="ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.405094 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.405073 2579 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-218.ec2.internal" not found Apr 19 15:25:01.408221 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.408202 2579 manager.go:324] Recovery completed Apr 19 15:25:01.412354 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.412341 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 15:25:01.414589 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.414576 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-218.ec2.internal" event="NodeHasSufficientMemory" Apr 19 15:25:01.414650 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.414603 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-218.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 15:25:01.414650 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.414614 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-218.ec2.internal" event="NodeHasSufficientPID" Apr 19 15:25:01.415578 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.415555 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 19 
15:25:01.415578 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.415577 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 19 15:25:01.415691 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.415607 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 19 15:25:01.418142 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.418128 2579 policy_none.go:49] "None policy: Start" Apr 19 15:25:01.418142 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.418145 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 19 15:25:01.418241 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.418155 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 19 15:25:01.449305 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.449290 2579 manager.go:341] "Starting Device Plugin manager" Apr 19 15:25:01.449388 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:01.449344 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 19 15:25:01.449388 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.449358 2579 server.go:85] "Starting device plugin registration server" Apr 19 15:25:01.449597 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.449583 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 19 15:25:01.449652 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.449598 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 19 15:25:01.449878 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.449863 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 19 15:25:01.449950 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.449932 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 19 15:25:01.449950 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.449939 2579 plugin_manager.go:118] 
"Starting Kubelet Plugin Manager" Apr 19 15:25:01.450536 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:01.450517 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 19 15:25:01.450637 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:01.450556 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-218.ec2.internal\" not found" Apr 19 15:25:01.462832 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.462808 2579 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-218.ec2.internal" not found Apr 19 15:25:01.539066 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.538995 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 19 15:25:01.540337 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.540311 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 19 15:25:01.540445 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.540349 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 19 15:25:01.540445 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.540367 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 19 15:25:01.540445 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.540374 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 19 15:25:01.540580 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:01.540471 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 19 15:25:01.542917 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.542897 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 15:25:01.550707 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.550692 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 15:25:01.551427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.551409 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-218.ec2.internal" event="NodeHasSufficientMemory" Apr 19 15:25:01.551519 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.551439 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-218.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 15:25:01.551519 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.551453 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-218.ec2.internal" event="NodeHasSufficientPID" Apr 19 15:25:01.551519 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.551482 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.560793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.560770 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.640928 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.640876 2579 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-218.ec2.internal"] Apr 19 15:25:01.643546 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.643525 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.643633 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.643533 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.668454 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.668432 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.672937 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.672923 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.680342 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.680325 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 19 15:25:01.697688 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.697660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/02ac498d4e3cb36b3700c57a1bf34412-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal\" (UID: \"02ac498d4e3cb36b3700c57a1bf34412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.697781 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.697692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/02ac498d4e3cb36b3700c57a1bf34412-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal\" (UID: \"02ac498d4e3cb36b3700c57a1bf34412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.697781 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.697733 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1ddb39581fd73259e0c27abfdf033d32-config\") pod \"kube-apiserver-proxy-ip-10-0-133-218.ec2.internal\" (UID: \"1ddb39581fd73259e0c27abfdf033d32\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.778251 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.778220 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 19 15:25:01.797934 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.797848 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/02ac498d4e3cb36b3700c57a1bf34412-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal\" (UID: \"02ac498d4e3cb36b3700c57a1bf34412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.797934 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.797883 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/02ac498d4e3cb36b3700c57a1bf34412-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal\" (UID: \"02ac498d4e3cb36b3700c57a1bf34412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.797934 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:25:01.797902 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1ddb39581fd73259e0c27abfdf033d32-config\") pod \"kube-apiserver-proxy-ip-10-0-133-218.ec2.internal\" (UID: \"1ddb39581fd73259e0c27abfdf033d32\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.798122 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.797947 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1ddb39581fd73259e0c27abfdf033d32-config\") pod \"kube-apiserver-proxy-ip-10-0-133-218.ec2.internal\" (UID: \"1ddb39581fd73259e0c27abfdf033d32\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.798122 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.797964 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/02ac498d4e3cb36b3700c57a1bf34412-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal\" (UID: \"02ac498d4e3cb36b3700c57a1bf34412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.798122 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.797960 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/02ac498d4e3cb36b3700c57a1bf34412-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal\" (UID: \"02ac498d4e3cb36b3700c57a1bf34412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" Apr 19 15:25:01.983418 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:01.983389 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-218.ec2.internal" Apr 19 15:25:02.081430 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.081333 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" Apr 19 15:25:02.284942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.284909 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 19 15:25:02.285702 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.285049 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 19 15:25:02.285702 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.285076 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 19 15:25:02.285702 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.285088 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 19 15:25:02.369822 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.369794 2579 apiserver.go:52] "Watching apiserver" Apr 19 15:25:02.373336 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.373292 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-18 15:20:01 +0000 
UTC" deadline="2027-12-19 09:57:07.387623875 +0000 UTC" Apr 19 15:25:02.373336 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.373330 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14610h32m5.014296048s" Apr 19 15:25:02.375931 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.375905 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 19 15:25:02.376273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.376253 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-218.ec2.internal","openshift-cluster-node-tuning-operator/tuned-vssx5","openshift-image-registry/node-ca-4tfml","openshift-multus/multus-wnr7b","openshift-multus/network-metrics-daemon-8cprr","openshift-network-diagnostics/network-check-target-r46tx","openshift-ovn-kubernetes/ovnkube-node-xxqlx","kube-system/konnectivity-agent-7r729","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f","openshift-dns/node-resolver-pss7s","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal","openshift-multus/multus-additional-cni-plugins-sz2ds","openshift-network-operator/iptables-alerter-2xg5z"] Apr 19 15:25:02.379292 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.379269 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.380338 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.380315 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4tfml" Apr 19 15:25:02.380475 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.380381 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.381567 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.381544 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 19 15:25:02.381753 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.381664 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 19 15:25:02.381829 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.381801 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mzwnd\"" Apr 19 15:25:02.381919 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.381899 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:02.382003 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:02.381981 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53" Apr 19 15:25:02.383273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.382755 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 19 15:25:02.383273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.382762 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 19 15:25:02.383273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.382870 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 19 15:25:02.383273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.382903 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 19 15:25:02.383273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.382929 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 19 15:25:02.383273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.382950 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7lpvg\"" Apr 19 15:25:02.383273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.382931 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lqrwl\"" Apr 19 15:25:02.383273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.383107 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 19 15:25:02.383273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.383198 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 19 15:25:02.384357 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.384339 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:02.384443 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:02.384406 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc" Apr 19 15:25:02.384525 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.384503 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.386173 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.386157 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-7r729" Apr 19 15:25:02.386569 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.386549 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 19 15:25:02.386657 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.386639 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 19 15:25:02.387118 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.387036 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qwwqc\"" Apr 19 15:25:02.387118 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.387099 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 19 15:25:02.387243 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.387209 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 19 15:25:02.387285 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.387274 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 19 15:25:02.387669 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.387651 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 19 15:25:02.387797 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.387736 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.388179 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.388161 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 19 15:25:02.388179 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.388169 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 19 15:25:02.388313 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.388213 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-k824p\"" Apr 19 15:25:02.389081 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.389065 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pss7s" Apr 19 15:25:02.389577 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.389563 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 19 15:25:02.389654 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.389640 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 19 15:25:02.389707 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.389655 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 19 15:25:02.390010 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.389994 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5w29k\"" Apr 19 15:25:02.390347 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.390332 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.390872 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.390854 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 19 15:25:02.390965 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.390923 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-x84qs\"" Apr 19 15:25:02.391026 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.390971 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 19 15:25:02.391439 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.391422 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2xg5z" Apr 19 15:25:02.392166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.392147 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 19 15:25:02.392629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.392609 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 19 15:25:02.392713 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.392691 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l6tp9\"" Apr 19 15:25:02.393340 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.393318 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 19 15:25:02.393441 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.393396 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 19 15:25:02.393441 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.393437 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 19 15:25:02.393543 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.393462 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 19 15:25:02.393946 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.393925 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-grs4d\"" Apr 19 15:25:02.397757 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.397535 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 19 15:25:02.401605 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401588 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-var-lib-cni-multus\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.401668 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401611 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8wwc\" (UniqueName: \"kubernetes.io/projected/41bb40b9-2854-47c5-8759-3fbea6b42b53-kube-api-access-z8wwc\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:02.401668 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401629 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-socket-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.401668 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401645 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-run\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.401813 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401688 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-sys\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.401813 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401763 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-run-netns\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.401813 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401786 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73514b32-300b-4466-b414-022b4c2e1f8e-ovnkube-config\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.401813 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401801 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73514b32-300b-4466-b414-022b4c2e1f8e-env-overrides\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.401941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401818 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89hx2\" (UniqueName: \"kubernetes.io/projected/300a9bea-6a69-423c-8267-02f715cc3b8f-kube-api-access-89hx2\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.401941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401841 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73514b32-300b-4466-b414-022b4c2e1f8e-ovnkube-script-lib\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.401941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401865 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk8fx\" (UniqueName: \"kubernetes.io/projected/99d8461b-833d-47ad-bf82-81619c11272e-kube-api-access-mk8fx\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.401941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401880 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-registration-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: 
\"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.401941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401908 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-sysctl-conf\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.401941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401926 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99d8461b-833d-47ad-bf82-81619c11272e-etc-tuned\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.401941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401941 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-kubelet\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401954 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-log-socket\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401968 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-run-netns\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.401985 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-hostroot\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402003 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3f939e64-6dbd-4802-9d83-7251b53cdcb5-konnectivity-ca\") pod \"konnectivity-agent-7r729\" (UID: \"3f939e64-6dbd-4802-9d83-7251b53cdcb5\") " pod="kube-system/konnectivity-agent-7r729" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402019 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-system-cni-dir\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402035 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-run-ovn\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402048 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-run-ovn-kubernetes\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402062 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-cni-netd\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402076 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-os-release\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402091 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwmzv\" (UniqueName: \"kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv\") pod \"network-check-target-r46tx\" (UID: \"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc\") " pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402104 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-systemd-units\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402119 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-cni-dir\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.402136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402132 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-run-k8s-cni-cncf-io\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402147 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dssq\" (UniqueName: \"kubernetes.io/projected/119caa96-ae84-4d21-8b14-6d528d9a67fd-kube-api-access-4dssq\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d6faab90-56cc-458f-bf13-4b00ae0b1686-cni-binary-copy\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402182 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phxpv\" (UniqueName: 
\"kubernetes.io/projected/73514b32-300b-4466-b414-022b4c2e1f8e-kube-api-access-phxpv\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402197 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dac7973c-ee33-410c-8f77-093953d73a03-hosts-file\") pod \"node-resolver-pss7s\" (UID: \"dac7973c-ee33-410c-8f77-093953d73a03\") " pod="openshift-dns/node-resolver-pss7s" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-cnibin\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402223 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b3a083b4-d7b2-4f52-b323-b957d5ebc531-serviceca\") pod \"node-ca-4tfml\" (UID: \"b3a083b4-d7b2-4f52-b323-b957d5ebc531\") " pod="openshift-image-registry/node-ca-4tfml" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402264 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-etc-openvswitch\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402287 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402303 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgw26\" (UniqueName: \"kubernetes.io/projected/d24bd074-79ed-4888-8f3c-4aa16738fea6-kube-api-access-tgw26\") pod \"iptables-alerter-2xg5z\" (UID: \"d24bd074-79ed-4888-8f3c-4aa16738fea6\") " pod="openshift-network-operator/iptables-alerter-2xg5z" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402317 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-modprobe-d\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402346 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-host\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402368 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d24bd074-79ed-4888-8f3c-4aa16738fea6-host-slash\") pod \"iptables-alerter-2xg5z\" (UID: \"d24bd074-79ed-4888-8f3c-4aa16738fea6\") " 
pod="openshift-network-operator/iptables-alerter-2xg5z" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402384 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402399 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-cnibin\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402414 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5fml\" (UniqueName: \"kubernetes.io/projected/b3a083b4-d7b2-4f52-b323-b957d5ebc531-kube-api-access-d5fml\") pod \"node-ca-4tfml\" (UID: \"b3a083b4-d7b2-4f52-b323-b957d5ebc531\") " pod="openshift-image-registry/node-ca-4tfml" Apr 19 15:25:02.402609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402429 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-slash\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402465 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-run-openvswitch\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402520 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-var-lib-kubelet\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402547 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-sys-fs\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-sysconfig\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-sysctl-d\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402617 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-run-systemd\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402638 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-node-log\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402659 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d24bd074-79ed-4888-8f3c-4aa16738fea6-iptables-alerter-script\") pod \"iptables-alerter-2xg5z\" (UID: \"d24bd074-79ed-4888-8f3c-4aa16738fea6\") " pod="openshift-network-operator/iptables-alerter-2xg5z"
Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402691 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3a083b4-d7b2-4f52-b323-b957d5ebc531-host\") pod \"node-ca-4tfml\" (UID: \"b3a083b4-d7b2-4f52-b323-b957d5ebc531\") " pod="openshift-image-registry/node-ca-4tfml"
Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-run-multus-certs\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402737 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-etc-kubernetes\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3f939e64-6dbd-4802-9d83-7251b53cdcb5-agent-certs\") pod \"konnectivity-agent-7r729\" (UID: \"3f939e64-6dbd-4802-9d83-7251b53cdcb5\") " pod="kube-system/konnectivity-agent-7r729"
Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402771 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d6faab90-56cc-458f-bf13-4b00ae0b1686-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds"
Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402790 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dac7973c-ee33-410c-8f77-093953d73a03-tmp-dir\") pod \"node-resolver-pss7s\" (UID: \"dac7973c-ee33-410c-8f77-093953d73a03\") " pod="openshift-dns/node-resolver-pss7s"
Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402810 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-kubernetes\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402834 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99d8461b-833d-47ad-bf82-81619c11272e-tmp\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.403234 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402856 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/119caa96-ae84-4d21-8b14-6d528d9a67fd-cni-binary-copy\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402875 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-conf-dir\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402896 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d6faab90-56cc-458f-bf13-4b00ae0b1686-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402910 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-system-cni-dir\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-var-lib-cni-bin\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402959 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402977 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-os-release\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.402991 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403006 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-systemd\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403028 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-var-lib-kubelet\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403048 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-var-lib-openvswitch\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403062 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-device-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403094 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-etc-selinux\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403124 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-lib-modules\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403148 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73514b32-300b-4466-b414-022b4c2e1f8e-ovn-node-metrics-cert\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403176 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89cj4\" (UniqueName: \"kubernetes.io/projected/dac7973c-ee33-410c-8f77-093953d73a03-kube-api-access-89cj4\") pod \"node-resolver-pss7s\" (UID: \"dac7973c-ee33-410c-8f77-093953d73a03\") " pod="openshift-dns/node-resolver-pss7s"
Apr 19 15:25:02.403793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403201 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-socket-dir-parent\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.404298 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403224 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-daemon-config\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.404298 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2fwh\" (UniqueName: \"kubernetes.io/projected/d6faab90-56cc-458f-bf13-4b00ae0b1686-kube-api-access-v2fwh\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds"
Apr 19 15:25:02.404298 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.403269 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-cni-bin\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.407817 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.407793 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 19 15:25:02.427424 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.427396 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-n2fhm"
Apr 19 15:25:02.434982 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.434958 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-n2fhm"
Apr 19 15:25:02.458672 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:02.458637 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddb39581fd73259e0c27abfdf033d32.slice/crio-d45d4c83296731e6fd61a6cdde5dcfbe9f092c072a3d44455a2814296f05f10b WatchSource:0}: Error finding container d45d4c83296731e6fd61a6cdde5dcfbe9f092c072a3d44455a2814296f05f10b: Status 404 returned error can't find the container with id d45d4c83296731e6fd61a6cdde5dcfbe9f092c072a3d44455a2814296f05f10b
Apr 19 15:25:02.464507 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.464487 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 15:25:02.467625 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:02.467597 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02ac498d4e3cb36b3700c57a1bf34412.slice/crio-61e9611d5847960fc3865424aa53eaa1fbe3b7ba4d7470e23064c0c99c3173b6 WatchSource:0}: Error finding container 61e9611d5847960fc3865424aa53eaa1fbe3b7ba4d7470e23064c0c99c3173b6: Status 404 returned error can't find the container with id 61e9611d5847960fc3865424aa53eaa1fbe3b7ba4d7470e23064c0c99c3173b6
Apr 19 15:25:02.503545 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503513 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dssq\" (UniqueName: \"kubernetes.io/projected/119caa96-ae84-4d21-8b14-6d528d9a67fd-kube-api-access-4dssq\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.503545 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503549 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d6faab90-56cc-458f-bf13-4b00ae0b1686-cni-binary-copy\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds"
Apr 19 15:25:02.503772 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phxpv\" (UniqueName: \"kubernetes.io/projected/73514b32-300b-4466-b414-022b4c2e1f8e-kube-api-access-phxpv\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.503772 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503608 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dac7973c-ee33-410c-8f77-093953d73a03-hosts-file\") pod \"node-resolver-pss7s\" (UID: \"dac7973c-ee33-410c-8f77-093953d73a03\") " pod="openshift-dns/node-resolver-pss7s"
Apr 19 15:25:02.503772 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503626 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-cnibin\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.503772 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503640 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b3a083b4-d7b2-4f52-b323-b957d5ebc531-serviceca\") pod \"node-ca-4tfml\" (UID: \"b3a083b4-d7b2-4f52-b323-b957d5ebc531\") " pod="openshift-image-registry/node-ca-4tfml"
Apr 19 15:25:02.503772 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-etc-openvswitch\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.503772 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503750 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-cnibin\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.503772 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503749 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dac7973c-ee33-410c-8f77-093953d73a03-hosts-file\") pod \"node-resolver-pss7s\" (UID: \"dac7973c-ee33-410c-8f77-093953d73a03\") " pod="openshift-dns/node-resolver-pss7s"
Apr 19 15:25:02.504055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-etc-openvswitch\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.504055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503803 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.504055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503843 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.504055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503869 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgw26\" (UniqueName: \"kubernetes.io/projected/d24bd074-79ed-4888-8f3c-4aa16738fea6-kube-api-access-tgw26\") pod \"iptables-alerter-2xg5z\" (UID: \"d24bd074-79ed-4888-8f3c-4aa16738fea6\") " pod="openshift-network-operator/iptables-alerter-2xg5z"
Apr 19 15:25:02.504055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503910 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-modprobe-d\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.504055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-host\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.504055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503958 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d24bd074-79ed-4888-8f3c-4aa16738fea6-host-slash\") pod \"iptables-alerter-2xg5z\" (UID: \"d24bd074-79ed-4888-8f3c-4aa16738fea6\") " pod="openshift-network-operator/iptables-alerter-2xg5z"
Apr 19 15:25:02.504055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.503992 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:02.504055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504026 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-host\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.504055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504033 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d24bd074-79ed-4888-8f3c-4aa16738fea6-host-slash\") pod \"iptables-alerter-2xg5z\" (UID: \"d24bd074-79ed-4888-8f3c-4aa16738fea6\") " pod="openshift-network-operator/iptables-alerter-2xg5z"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504067 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-cnibin\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:02.504091 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504096 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5fml\" (UniqueName: \"kubernetes.io/projected/b3a083b4-d7b2-4f52-b323-b957d5ebc531-kube-api-access-d5fml\") pod \"node-ca-4tfml\" (UID: \"b3a083b4-d7b2-4f52-b323-b957d5ebc531\") " pod="openshift-image-registry/node-ca-4tfml"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504101 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-cnibin\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504123 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-slash\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504085 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b3a083b4-d7b2-4f52-b323-b957d5ebc531-serviceca\") pod \"node-ca-4tfml\" (UID: \"b3a083b4-d7b2-4f52-b323-b957d5ebc531\") " pod="openshift-image-registry/node-ca-4tfml"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504166 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-slash\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-modprobe-d\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504162 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d6faab90-56cc-458f-bf13-4b00ae0b1686-cni-binary-copy\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:02.504165 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs podName:41bb40b9-2854-47c5-8759-3fbea6b42b53 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:03.004139968 +0000 UTC m=+2.110055334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs") pod "network-metrics-daemon-8cprr" (UID: "41bb40b9-2854-47c5-8759-3fbea6b42b53") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504230 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-run-openvswitch\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-var-lib-kubelet\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504268 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-run-openvswitch\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-sys-fs\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504308 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-sysconfig\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504310 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-var-lib-kubelet\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.504438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504333 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-sysctl-d\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504367 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-sys-fs\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504376 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-sysconfig\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504401 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-run-systemd\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504432 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-node-log\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504435 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-sysctl-d\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504450 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d24bd074-79ed-4888-8f3c-4aa16738fea6-iptables-alerter-script\") pod \"iptables-alerter-2xg5z\" (UID: \"d24bd074-79ed-4888-8f3c-4aa16738fea6\") " pod="openshift-network-operator/iptables-alerter-2xg5z"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504467 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3a083b4-d7b2-4f52-b323-b957d5ebc531-host\") pod \"node-ca-4tfml\" (UID: \"b3a083b4-d7b2-4f52-b323-b957d5ebc531\") " pod="openshift-image-registry/node-ca-4tfml"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504475 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-run-systemd\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504484 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-run-multus-certs\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504499 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-etc-kubernetes\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504513 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-node-log\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504516 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3a083b4-d7b2-4f52-b323-b957d5ebc531-host\") pod \"node-ca-4tfml\" (UID: \"b3a083b4-d7b2-4f52-b323-b957d5ebc531\") " pod="openshift-image-registry/node-ca-4tfml"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504534 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-run-multus-certs\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504535 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-etc-kubernetes\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3f939e64-6dbd-4802-9d83-7251b53cdcb5-agent-certs\") pod \"konnectivity-agent-7r729\" (UID: \"3f939e64-6dbd-4802-9d83-7251b53cdcb5\") " pod="kube-system/konnectivity-agent-7r729"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504564 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d6faab90-56cc-458f-bf13-4b00ae0b1686-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dac7973c-ee33-410c-8f77-093953d73a03-tmp-dir\") pod \"node-resolver-pss7s\" (UID: \"dac7973c-ee33-410c-8f77-093953d73a03\") " pod="openshift-dns/node-resolver-pss7s"
Apr 19 15:25:02.505147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504690 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-kubernetes\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504769 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99d8461b-833d-47ad-bf82-81619c11272e-tmp\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504791 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-kubernetes\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504847 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dac7973c-ee33-410c-8f77-093953d73a03-tmp-dir\") pod \"node-resolver-pss7s\" (UID: \"dac7973c-ee33-410c-8f77-093953d73a03\") " pod="openshift-dns/node-resolver-pss7s"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504795 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/119caa96-ae84-4d21-8b14-6d528d9a67fd-cni-binary-copy\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504925 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-conf-dir\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504947 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d6faab90-56cc-458f-bf13-4b00ae0b1686-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504963 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-system-cni-dir\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504960 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.504994 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d24bd074-79ed-4888-8f3c-4aa16738fea6-iptables-alerter-script\") pod \"iptables-alerter-2xg5z\" (UID: \"d24bd074-79ed-4888-8f3c-4aa16738fea6\") " pod="openshift-network-operator/iptables-alerter-2xg5z"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505011 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-conf-dir\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505048 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-var-lib-cni-bin\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505054 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-system-cni-dir\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b"
Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505084 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d6faab90-56cc-458f-bf13-4b00ae0b1686-cni-sysctl-allowlist\") pod
\"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505072 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505137 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-os-release\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505148 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505165 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.506166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505093 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-var-lib-cni-bin\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505191 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-systemd\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505220 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-var-lib-kubelet\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505237 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-os-release\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505248 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-var-lib-openvswitch\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505274 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-device-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505284 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-systemd\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505297 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/119caa96-ae84-4d21-8b14-6d528d9a67fd-cni-binary-copy\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505306 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-etc-selinux\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:25:02.505328 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-var-lib-openvswitch\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505351 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-etc-selinux\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505387 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-var-lib-kubelet\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505416 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-lib-modules\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505423 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-device-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.506958 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505448 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73514b32-300b-4466-b414-022b4c2e1f8e-ovn-node-metrics-cert\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505458 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d6faab90-56cc-458f-bf13-4b00ae0b1686-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.506958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505472 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89cj4\" (UniqueName: \"kubernetes.io/projected/dac7973c-ee33-410c-8f77-093953d73a03-kube-api-access-89cj4\") pod \"node-resolver-pss7s\" (UID: \"dac7973c-ee33-410c-8f77-093953d73a03\") " pod="openshift-dns/node-resolver-pss7s" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505496 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-socket-dir-parent\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505565 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-lib-modules\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505590 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-socket-dir-parent\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505634 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-daemon-config\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505741 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2fwh\" (UniqueName: \"kubernetes.io/projected/d6faab90-56cc-458f-bf13-4b00ae0b1686-kube-api-access-v2fwh\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505761 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-cni-bin\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505776 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-var-lib-cni-multus\") pod 
\"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505798 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8wwc\" (UniqueName: \"kubernetes.io/projected/41bb40b9-2854-47c5-8759-3fbea6b42b53-kube-api-access-z8wwc\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505823 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-socket-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505824 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-cni-bin\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505843 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-var-lib-cni-multus\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505846 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-run\") pod 
\"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505941 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-sys\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.505984 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-run-netns\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506014 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73514b32-300b-4466-b414-022b4c2e1f8e-ovnkube-config\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506027 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-socket-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.507700 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506040 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73514b32-300b-4466-b414-022b4c2e1f8e-env-overrides\") 
pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506052 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-run\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506067 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89hx2\" (UniqueName: \"kubernetes.io/projected/300a9bea-6a69-423c-8267-02f715cc3b8f-kube-api-access-89hx2\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73514b32-300b-4466-b414-022b4c2e1f8e-ovnkube-script-lib\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506118 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mk8fx\" (UniqueName: \"kubernetes.io/projected/99d8461b-833d-47ad-bf82-81619c11272e-kube-api-access-mk8fx\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-registration-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-sysctl-conf\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506227 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99d8461b-833d-47ad-bf82-81619c11272e-etc-tuned\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506254 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-kubelet\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506277 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-daemon-config\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506282 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-log-socket\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-run-netns\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506329 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-hostroot\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506354 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3f939e64-6dbd-4802-9d83-7251b53cdcb5-konnectivity-ca\") pod \"konnectivity-agent-7r729\" (UID: \"3f939e64-6dbd-4802-9d83-7251b53cdcb5\") " pod="kube-system/konnectivity-agent-7r729" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506396 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-system-cni-dir\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506420 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-run-ovn\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506445 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-run-ovn-kubernetes\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506503 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-cni-netd\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506529 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-os-release\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmzv\" (UniqueName: \"kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv\") pod \"network-check-target-r46tx\" (UID: \"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc\") " pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506566 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-log-socket\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-systemd-units\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506582 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d6faab90-56cc-458f-bf13-4b00ae0b1686-system-cni-dir\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506606 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-cni-dir\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506636 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-run-k8s-cni-cncf-io\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506642 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-run-netns\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506743 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-os-release\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506782 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-run-ovn\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-run-ovn-kubernetes\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/300a9bea-6a69-423c-8267-02f715cc3b8f-registration-dir\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506864 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-cni-netd\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506927 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-etc-sysctl-conf\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.506970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-kubelet\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.507053 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-hostroot\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.507111 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-multus-cni-dir\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.508916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.507142 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73514b32-300b-4466-b414-022b4c2e1f8e-env-overrides\") 
pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.509665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.507324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-host-run-netns\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.509665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.507348 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73514b32-300b-4466-b414-022b4c2e1f8e-ovnkube-script-lib\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.509665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.507379 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99d8461b-833d-47ad-bf82-81619c11272e-sys\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.509665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.507421 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/119caa96-ae84-4d21-8b14-6d528d9a67fd-host-run-k8s-cni-cncf-io\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.509665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.507455 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73514b32-300b-4466-b414-022b4c2e1f8e-systemd-units\") pod \"ovnkube-node-xxqlx\" (UID: 
\"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.509665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.507583 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73514b32-300b-4466-b414-022b4c2e1f8e-ovnkube-config\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.509665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.507738 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3f939e64-6dbd-4802-9d83-7251b53cdcb5-konnectivity-ca\") pod \"konnectivity-agent-7r729\" (UID: \"3f939e64-6dbd-4802-9d83-7251b53cdcb5\") " pod="kube-system/konnectivity-agent-7r729" Apr 19 15:25:02.509665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.508604 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99d8461b-833d-47ad-bf82-81619c11272e-tmp\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.509665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.508681 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73514b32-300b-4466-b414-022b4c2e1f8e-ovn-node-metrics-cert\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.509665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.508751 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3f939e64-6dbd-4802-9d83-7251b53cdcb5-agent-certs\") pod \"konnectivity-agent-7r729\" (UID: 
\"3f939e64-6dbd-4802-9d83-7251b53cdcb5\") " pod="kube-system/konnectivity-agent-7r729" Apr 19 15:25:02.509665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.509331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99d8461b-833d-47ad-bf82-81619c11272e-etc-tuned\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.512326 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:02.512302 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 15:25:02.512326 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:02.512330 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 15:25:02.512480 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:02.512343 2579 projected.go:194] Error preparing data for projected volume kube-api-access-rwmzv for pod openshift-network-diagnostics/network-check-target-r46tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:25:02.512480 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:02.512413 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv podName:445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc nodeName:}" failed. No retries permitted until 2026-04-19 15:25:03.012395516 +0000 UTC m=+2.118310905 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rwmzv" (UniqueName: "kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv") pod "network-check-target-r46tx" (UID: "445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:25:02.512983 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.512924 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dssq\" (UniqueName: \"kubernetes.io/projected/119caa96-ae84-4d21-8b14-6d528d9a67fd-kube-api-access-4dssq\") pod \"multus-wnr7b\" (UID: \"119caa96-ae84-4d21-8b14-6d528d9a67fd\") " pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.513314 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.513284 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgw26\" (UniqueName: \"kubernetes.io/projected/d24bd074-79ed-4888-8f3c-4aa16738fea6-kube-api-access-tgw26\") pod \"iptables-alerter-2xg5z\" (UID: \"d24bd074-79ed-4888-8f3c-4aa16738fea6\") " pod="openshift-network-operator/iptables-alerter-2xg5z" Apr 19 15:25:02.513655 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.513636 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phxpv\" (UniqueName: \"kubernetes.io/projected/73514b32-300b-4466-b414-022b4c2e1f8e-kube-api-access-phxpv\") pod \"ovnkube-node-xxqlx\" (UID: \"73514b32-300b-4466-b414-022b4c2e1f8e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.513995 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.513976 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89cj4\" (UniqueName: \"kubernetes.io/projected/dac7973c-ee33-410c-8f77-093953d73a03-kube-api-access-89cj4\") pod \"node-resolver-pss7s\" (UID: \"dac7973c-ee33-410c-8f77-093953d73a03\") " 
pod="openshift-dns/node-resolver-pss7s" Apr 19 15:25:02.514201 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.514186 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5fml\" (UniqueName: \"kubernetes.io/projected/b3a083b4-d7b2-4f52-b323-b957d5ebc531-kube-api-access-d5fml\") pod \"node-ca-4tfml\" (UID: \"b3a083b4-d7b2-4f52-b323-b957d5ebc531\") " pod="openshift-image-registry/node-ca-4tfml" Apr 19 15:25:02.514291 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.514270 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2fwh\" (UniqueName: \"kubernetes.io/projected/d6faab90-56cc-458f-bf13-4b00ae0b1686-kube-api-access-v2fwh\") pod \"multus-additional-cni-plugins-sz2ds\" (UID: \"d6faab90-56cc-458f-bf13-4b00ae0b1686\") " pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.514672 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.514652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8wwc\" (UniqueName: \"kubernetes.io/projected/41bb40b9-2854-47c5-8759-3fbea6b42b53-kube-api-access-z8wwc\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:02.514952 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.514937 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89hx2\" (UniqueName: \"kubernetes.io/projected/300a9bea-6a69-423c-8267-02f715cc3b8f-kube-api-access-89hx2\") pod \"aws-ebs-csi-driver-node-9mv2f\" (UID: \"300a9bea-6a69-423c-8267-02f715cc3b8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.515758 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.515713 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk8fx\" (UniqueName: 
\"kubernetes.io/projected/99d8461b-833d-47ad-bf82-81619c11272e-kube-api-access-mk8fx\") pod \"tuned-vssx5\" (UID: \"99d8461b-833d-47ad-bf82-81619c11272e\") " pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.543552 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.543488 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" event={"ID":"02ac498d4e3cb36b3700c57a1bf34412","Type":"ContainerStarted","Data":"61e9611d5847960fc3865424aa53eaa1fbe3b7ba4d7470e23064c0c99c3173b6"} Apr 19 15:25:02.544368 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.544342 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-218.ec2.internal" event={"ID":"1ddb39581fd73259e0c27abfdf033d32","Type":"ContainerStarted","Data":"d45d4c83296731e6fd61a6cdde5dcfbe9f092c072a3d44455a2814296f05f10b"} Apr 19 15:25:02.710921 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.710882 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vssx5" Apr 19 15:25:02.714613 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.714591 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4tfml" Apr 19 15:25:02.722445 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:02.722420 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3a083b4_d7b2_4f52_b323_b957d5ebc531.slice/crio-b864d5f6f1f3151041902977a182e9faf5a117105920dc8522f821cd8c54d30c WatchSource:0}: Error finding container b864d5f6f1f3151041902977a182e9faf5a117105920dc8522f821cd8c54d30c: Status 404 returned error can't find the container with id b864d5f6f1f3151041902977a182e9faf5a117105920dc8522f821cd8c54d30c Apr 19 15:25:02.732367 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.732347 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wnr7b" Apr 19 15:25:02.738449 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:02.738420 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod119caa96_ae84_4d21_8b14_6d528d9a67fd.slice/crio-1d5da73c9b012693fe40717020540da6d4fcd0d7b60fbc0e810a519928a52dee WatchSource:0}: Error finding container 1d5da73c9b012693fe40717020540da6d4fcd0d7b60fbc0e810a519928a52dee: Status 404 returned error can't find the container with id 1d5da73c9b012693fe40717020540da6d4fcd0d7b60fbc0e810a519928a52dee Apr 19 15:25:02.746693 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.746677 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:02.752071 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:02.752040 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73514b32_300b_4466_b414_022b4c2e1f8e.slice/crio-d2a727543d5a733a9545cb2d8d96545db75941ec19ca8e6a5317fe9470aff1ee WatchSource:0}: Error finding container d2a727543d5a733a9545cb2d8d96545db75941ec19ca8e6a5317fe9470aff1ee: Status 404 returned error can't find the container with id d2a727543d5a733a9545cb2d8d96545db75941ec19ca8e6a5317fe9470aff1ee Apr 19 15:25:02.770732 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.770698 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7r729" Apr 19 15:25:02.775362 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.775344 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" Apr 19 15:25:02.777843 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:02.777811 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f939e64_6dbd_4802_9d83_7251b53cdcb5.slice/crio-a4cc36d99355b1275eb6707e11833c93344674d3bd87473e7c894e51dc22a597 WatchSource:0}: Error finding container a4cc36d99355b1275eb6707e11833c93344674d3bd87473e7c894e51dc22a597: Status 404 returned error can't find the container with id a4cc36d99355b1275eb6707e11833c93344674d3bd87473e7c894e51dc22a597 Apr 19 15:25:02.781476 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.781450 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pss7s" Apr 19 15:25:02.781768 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:02.781742 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod300a9bea_6a69_423c_8267_02f715cc3b8f.slice/crio-2a737c324f73e1a5e1e5e73e9a19f8cc6025513f2049f6abd4d66c727668fb79 WatchSource:0}: Error finding container 2a737c324f73e1a5e1e5e73e9a19f8cc6025513f2049f6abd4d66c727668fb79: Status 404 returned error can't find the container with id 2a737c324f73e1a5e1e5e73e9a19f8cc6025513f2049f6abd4d66c727668fb79 Apr 19 15:25:02.786956 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.786938 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sz2ds" Apr 19 15:25:02.787199 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:02.787178 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddac7973c_ee33_410c_8f77_093953d73a03.slice/crio-a6dd8a0bc479ce2a5fa6dfd53783cd97af3619c32d2549beb6aeab689318e5a4 WatchSource:0}: Error finding container a6dd8a0bc479ce2a5fa6dfd53783cd97af3619c32d2549beb6aeab689318e5a4: Status 404 returned error can't find the container with id a6dd8a0bc479ce2a5fa6dfd53783cd97af3619c32d2549beb6aeab689318e5a4 Apr 19 15:25:02.792549 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:02.792531 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2xg5z" Apr 19 15:25:02.793359 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:02.793340 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6faab90_56cc_458f_bf13_4b00ae0b1686.slice/crio-4d5b5356ef5badc6b9378b57d40584810ae3f2268dac8569e9d06a91e0e78dfe WatchSource:0}: Error finding container 4d5b5356ef5badc6b9378b57d40584810ae3f2268dac8569e9d06a91e0e78dfe: Status 404 returned error can't find the container with id 4d5b5356ef5badc6b9378b57d40584810ae3f2268dac8569e9d06a91e0e78dfe Apr 19 15:25:02.799130 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:02.799109 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd24bd074_79ed_4888_8f3c_4aa16738fea6.slice/crio-609310811efd70e7e4a74cc4a008533721b10f98ef2b0f5f24ab638d4da89d3b WatchSource:0}: Error finding container 609310811efd70e7e4a74cc4a008533721b10f98ef2b0f5f24ab638d4da89d3b: Status 404 returned error can't find the container with id 609310811efd70e7e4a74cc4a008533721b10f98ef2b0f5f24ab638d4da89d3b Apr 19 15:25:03.010677 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.010593 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:03.010865 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:03.010844 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:25:03.010929 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:03.010911 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs podName:41bb40b9-2854-47c5-8759-3fbea6b42b53 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:04.010889894 +0000 UTC m=+3.116805266 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs") pod "network-metrics-daemon-8cprr" (UID: "41bb40b9-2854-47c5-8759-3fbea6b42b53") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:25:03.111299 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.111225 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmzv\" (UniqueName: \"kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv\") pod \"network-check-target-r46tx\" (UID: \"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc\") " pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:03.111471 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:03.111389 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 15:25:03.111471 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:03.111410 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 15:25:03.111471 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:03.111423 2579 projected.go:194] Error preparing data for projected volume kube-api-access-rwmzv for pod openshift-network-diagnostics/network-check-target-r46tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:25:03.111627 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:03.111488 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv podName:445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc nodeName:}" failed. No retries permitted until 2026-04-19 15:25:04.111468832 +0000 UTC m=+3.217384212 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rwmzv" (UniqueName: "kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv") pod "network-check-target-r46tx" (UID: "445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:25:03.431792 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.431762 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 15:25:03.436596 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.436554 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 15:20:02 +0000 UTC" deadline="2027-10-01 20:23:25.647551319 +0000 UTC" Apr 19 15:25:03.436596 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.436596 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12724h58m22.210959738s" Apr 19 15:25:03.543557 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.543519 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:03.543747 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:03.543669 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53" Apr 19 15:25:03.549621 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.549593 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 15:25:03.577285 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.577222 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2xg5z" event={"ID":"d24bd074-79ed-4888-8f3c-4aa16738fea6","Type":"ContainerStarted","Data":"609310811efd70e7e4a74cc4a008533721b10f98ef2b0f5f24ab638d4da89d3b"} Apr 19 15:25:03.579051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.578996 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pss7s" event={"ID":"dac7973c-ee33-410c-8f77-093953d73a03","Type":"ContainerStarted","Data":"a6dd8a0bc479ce2a5fa6dfd53783cd97af3619c32d2549beb6aeab689318e5a4"} Apr 19 15:25:03.586797 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.586771 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7r729" event={"ID":"3f939e64-6dbd-4802-9d83-7251b53cdcb5","Type":"ContainerStarted","Data":"a4cc36d99355b1275eb6707e11833c93344674d3bd87473e7c894e51dc22a597"} Apr 19 15:25:03.600531 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.600496 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wnr7b" event={"ID":"119caa96-ae84-4d21-8b14-6d528d9a67fd","Type":"ContainerStarted","Data":"1d5da73c9b012693fe40717020540da6d4fcd0d7b60fbc0e810a519928a52dee"} Apr 19 15:25:03.620012 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.619975 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4tfml" event={"ID":"b3a083b4-d7b2-4f52-b323-b957d5ebc531","Type":"ContainerStarted","Data":"b864d5f6f1f3151041902977a182e9faf5a117105920dc8522f821cd8c54d30c"} 
Apr 19 15:25:03.625329 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.625296 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz2ds" event={"ID":"d6faab90-56cc-458f-bf13-4b00ae0b1686","Type":"ContainerStarted","Data":"4d5b5356ef5badc6b9378b57d40584810ae3f2268dac8569e9d06a91e0e78dfe"} Apr 19 15:25:03.629703 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.629677 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" event={"ID":"300a9bea-6a69-423c-8267-02f715cc3b8f","Type":"ContainerStarted","Data":"2a737c324f73e1a5e1e5e73e9a19f8cc6025513f2049f6abd4d66c727668fb79"} Apr 19 15:25:03.639973 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.639936 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" event={"ID":"73514b32-300b-4466-b414-022b4c2e1f8e","Type":"ContainerStarted","Data":"d2a727543d5a733a9545cb2d8d96545db75941ec19ca8e6a5317fe9470aff1ee"} Apr 19 15:25:03.652187 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.652157 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vssx5" event={"ID":"99d8461b-833d-47ad-bf82-81619c11272e","Type":"ContainerStarted","Data":"225239bc71a7550eee13058d01d5fb028dfe9d35ae2c522c32f0557f0bf094cc"} Apr 19 15:25:03.668558 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:03.668531 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 15:25:04.022054 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:04.021436 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr" 
Apr 19 15:25:04.022054 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:04.021584 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:25:04.022054 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:04.021649 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs podName:41bb40b9-2854-47c5-8759-3fbea6b42b53 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:06.021629328 +0000 UTC m=+5.127544717 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs") pod "network-metrics-daemon-8cprr" (UID: "41bb40b9-2854-47c5-8759-3fbea6b42b53") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:25:04.123050 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:04.122374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmzv\" (UniqueName: \"kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv\") pod \"network-check-target-r46tx\" (UID: \"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc\") " pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:04.123050 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:04.122567 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 15:25:04.123050 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:04.122589 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 15:25:04.123050 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:04.122602 2579 projected.go:194] Error preparing data for 
projected volume kube-api-access-rwmzv for pod openshift-network-diagnostics/network-check-target-r46tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:25:04.123050 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:04.122662 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv podName:445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc nodeName:}" failed. No retries permitted until 2026-04-19 15:25:06.122643673 +0000 UTC m=+5.228559045 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rwmzv" (UniqueName: "kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv") pod "network-check-target-r46tx" (UID: "445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:25:04.437458 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:04.437413 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 15:20:02 +0000 UTC" deadline="2027-12-16 08:57:43.896327881 +0000 UTC" Apr 19 15:25:04.437458 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:04.437456 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14537h32m39.458874833s" Apr 19 15:25:04.541153 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:04.541121 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:04.541328 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:04.541247 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc" Apr 19 15:25:05.541333 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:05.541298 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:05.541747 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:05.541454 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53"
Apr 19 15:25:06.038669 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:06.038033 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:06.038669 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:06.038230 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 15:25:06.038669 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:06.038296 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs podName:41bb40b9-2854-47c5-8759-3fbea6b42b53 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:10.038277182 +0000 UTC m=+9.144192554 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs") pod "network-metrics-daemon-8cprr" (UID: "41bb40b9-2854-47c5-8759-3fbea6b42b53") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 15:25:06.139158 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:06.139123 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmzv\" (UniqueName: \"kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv\") pod \"network-check-target-r46tx\" (UID: \"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc\") " pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:06.139343 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:06.139321 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 15:25:06.139343 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:06.139342 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 15:25:06.139454 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:06.139356 2579 projected.go:194] Error preparing data for projected volume kube-api-access-rwmzv for pod openshift-network-diagnostics/network-check-target-r46tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 15:25:06.139454 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:06.139418 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv podName:445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc nodeName:}" failed. No retries permitted until 2026-04-19 15:25:10.139399141 +0000 UTC m=+9.245314528 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rwmzv" (UniqueName: "kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv") pod "network-check-target-r46tx" (UID: "445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 15:25:06.541148 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:06.541033 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:06.541346 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:06.541176 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc"
Apr 19 15:25:07.541525 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:07.540983 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:07.541525 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:07.541125 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53"
Apr 19 15:25:08.541750 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:08.541234 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:08.541750 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:08.541361 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc"
Apr 19 15:25:09.541100 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:09.541051 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:09.541256 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:09.541206 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53"
Apr 19 15:25:10.073226 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:10.073188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:10.073615 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:10.073389 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 15:25:10.073615 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:10.073451 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs podName:41bb40b9-2854-47c5-8759-3fbea6b42b53 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:18.073431576 +0000 UTC m=+17.179346960 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs") pod "network-metrics-daemon-8cprr" (UID: "41bb40b9-2854-47c5-8759-3fbea6b42b53") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 15:25:10.174625 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:10.173916 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmzv\" (UniqueName: \"kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv\") pod \"network-check-target-r46tx\" (UID: \"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc\") " pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:10.174625 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:10.174114 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 15:25:10.174625 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:10.174136 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 15:25:10.174625 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:10.174148 2579 projected.go:194] Error preparing data for projected volume kube-api-access-rwmzv for pod openshift-network-diagnostics/network-check-target-r46tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 15:25:10.174625 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:10.174205 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv podName:445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc nodeName:}" failed. No retries permitted until 2026-04-19 15:25:18.174186154 +0000 UTC m=+17.280101524 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rwmzv" (UniqueName: "kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv") pod "network-check-target-r46tx" (UID: "445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 15:25:10.541109 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:10.541064 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:10.541273 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:10.541208 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc"
Apr 19 15:25:11.542531 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:11.542430 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:11.542973 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:11.542590 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53"
Apr 19 15:25:12.540610 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:12.540561 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:12.540788 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:12.540682 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc"
Apr 19 15:25:13.541674 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:13.541637 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:13.542151 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:13.541797 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53"
Apr 19 15:25:14.541236 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:14.541206 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:14.541387 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:14.541305 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc"
Apr 19 15:25:15.540680 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:15.540638 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:15.541124 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:15.540784 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53"
Apr 19 15:25:16.541594 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:16.541508 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:16.542052 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:16.541647 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc"
Apr 19 15:25:17.540881 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:17.540843 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:17.541051 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:17.540995 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53"
Apr 19 15:25:18.131145 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:18.131109 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:18.131599 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:18.131236 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 15:25:18.131599 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:18.131301 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs podName:41bb40b9-2854-47c5-8759-3fbea6b42b53 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:34.131281627 +0000 UTC m=+33.237196998 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs") pod "network-metrics-daemon-8cprr" (UID: "41bb40b9-2854-47c5-8759-3fbea6b42b53") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 15:25:18.232419 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:18.232380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmzv\" (UniqueName: \"kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv\") pod \"network-check-target-r46tx\" (UID: \"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc\") " pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:18.232609 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:18.232526 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 15:25:18.232609 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:18.232548 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 15:25:18.232609 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:18.232560 2579 projected.go:194] Error preparing data for projected volume kube-api-access-rwmzv for pod openshift-network-diagnostics/network-check-target-r46tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 15:25:18.232773 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:18.232618 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv podName:445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc nodeName:}" failed. No retries permitted until 2026-04-19 15:25:34.23260492 +0000 UTC m=+33.338520287 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rwmzv" (UniqueName: "kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv") pod "network-check-target-r46tx" (UID: "445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 15:25:18.540990 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:18.540956 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:18.541174 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:18.541081 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc"
Apr 19 15:25:19.541084 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:19.541038 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:19.541647 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:19.541199 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53"
Apr 19 15:25:20.540778 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:20.540749 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:20.540961 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:20.540861 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc"
Apr 19 15:25:21.543482 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.543105 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:21.544144 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:21.543482 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53"
Apr 19 15:25:21.699983 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.699779 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wnr7b" event={"ID":"119caa96-ae84-4d21-8b14-6d528d9a67fd","Type":"ContainerStarted","Data":"21d5bcaeca3f7ebabdcb4ca874d3d1e8fbca2ffff9e491c81448aebecf1dedf5"}
Apr 19 15:25:21.703246 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.703226 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log"
Apr 19 15:25:21.703601 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.703579 2579 generic.go:358] "Generic (PLEG): container finished" podID="73514b32-300b-4466-b414-022b4c2e1f8e" containerID="a599e13244418cf70e9adea124dfc597e76f165aef3b74881cb27cb2a17f3bcb" exitCode=1
Apr 19 15:25:21.703692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.703648 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" event={"ID":"73514b32-300b-4466-b414-022b4c2e1f8e","Type":"ContainerStarted","Data":"06c361262b919c2d72097d3346fec2284b20541bd73809ac665321bdad85dcff"}
Apr 19 15:25:21.703692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.703667 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" event={"ID":"73514b32-300b-4466-b414-022b4c2e1f8e","Type":"ContainerStarted","Data":"222fb05c0dbf62a96710d62121c765900f4b1fc8ab6dd87b8b2ca377da3098dc"}
Apr 19 15:25:21.703692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.703676 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" event={"ID":"73514b32-300b-4466-b414-022b4c2e1f8e","Type":"ContainerStarted","Data":"8926999e9a527ced0575c6376b5e438653f5343c9d0842d5ea29772db6f6eb19"}
Apr 19 15:25:21.703692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.703684 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" event={"ID":"73514b32-300b-4466-b414-022b4c2e1f8e","Type":"ContainerDied","Data":"a599e13244418cf70e9adea124dfc597e76f165aef3b74881cb27cb2a17f3bcb"}
Apr 19 15:25:21.703868 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.703694 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" event={"ID":"73514b32-300b-4466-b414-022b4c2e1f8e","Type":"ContainerStarted","Data":"b943eccb170fecca39cb761a859f3dee1ab5b376fe3386f9eb523e146635164b"}
Apr 19 15:25:21.706238 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.706217 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vssx5" event={"ID":"99d8461b-833d-47ad-bf82-81619c11272e","Type":"ContainerStarted","Data":"cec9948314d74c3157d07304b4ad7506e4da28369d1fb343caea63420265c79b"}
Apr 19 15:25:21.714481 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.714374 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wnr7b" podStartSLOduration=2.04889595 podStartE2EDuration="20.714357645s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="2026-04-19 15:25:02.739797151 +0000 UTC m=+1.845712521" lastFinishedPulling="2026-04-19 15:25:21.40525885 +0000 UTC m=+20.511174216" observedRunningTime="2026-04-19 15:25:21.713779497 +0000 UTC m=+20.819694875" watchObservedRunningTime="2026-04-19 15:25:21.714357645 +0000 UTC m=+20.820273035"
Apr 19 15:25:21.715247 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.715224 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-218.ec2.internal" event={"ID":"1ddb39581fd73259e0c27abfdf033d32","Type":"ContainerStarted","Data":"c1c0cc42c9f79a4cdf0410e3dbe71b3774b0469a3ede0ab99504acc345924825"}
Apr 19 15:25:21.728133 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.728095 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vssx5" podStartSLOduration=2.341044003 podStartE2EDuration="20.728082538s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="2026-04-19 15:25:02.721822748 +0000 UTC m=+1.827738115" lastFinishedPulling="2026-04-19 15:25:21.108861265 +0000 UTC m=+20.214776650" observedRunningTime="2026-04-19 15:25:21.727837027 +0000 UTC m=+20.833752411" watchObservedRunningTime="2026-04-19 15:25:21.728082538 +0000 UTC m=+20.833997925"
Apr 19 15:25:21.740686 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:21.740621 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-218.ec2.internal" podStartSLOduration=20.740549612 podStartE2EDuration="20.740549612s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:25:21.739754524 +0000 UTC m=+20.845669910" watchObservedRunningTime="2026-04-19 15:25:21.740549612 +0000 UTC m=+20.846465001"
Apr 19 15:25:22.541148 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.541123 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:22.541247 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:22.541229 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc"
Apr 19 15:25:22.719126 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.719085 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" event={"ID":"300a9bea-6a69-423c-8267-02f715cc3b8f","Type":"ContainerStarted","Data":"99ecca3365c2a56224a80e36b6ce36884fb0b020ed879fc1c9a56ea0921441c7"}
Apr 19 15:25:22.722015 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.721992 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log"
Apr 19 15:25:22.722397 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.722358 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" event={"ID":"73514b32-300b-4466-b414-022b4c2e1f8e","Type":"ContainerStarted","Data":"a8990780606007dd428114698eab1c6276921d2b1be04e0b21a4cec9c143b51b"}
Apr 19 15:25:22.723815 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.723793 2579 generic.go:358] "Generic (PLEG): container finished" podID="02ac498d4e3cb36b3700c57a1bf34412" containerID="d8883f584bdfa0c49e23a09fb46952ed6808d5ed7ca1dbbf34bb0ab91dac3a73" exitCode=0
Apr 19 15:25:22.723936 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.723877 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" event={"ID":"02ac498d4e3cb36b3700c57a1bf34412","Type":"ContainerDied","Data":"d8883f584bdfa0c49e23a09fb46952ed6808d5ed7ca1dbbf34bb0ab91dac3a73"}
Apr 19 15:25:22.725285 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.725264 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2xg5z" event={"ID":"d24bd074-79ed-4888-8f3c-4aa16738fea6","Type":"ContainerStarted","Data":"0d0738c830c06dbc6c0eb8729d334133fd4b3168aa7e59b37ab1d241e9a02772"}
Apr 19 15:25:22.726741 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.726682 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pss7s" event={"ID":"dac7973c-ee33-410c-8f77-093953d73a03","Type":"ContainerStarted","Data":"8dd6e9ecc7072350d6e36fb67a0789ce0cfea69899ceef3af032a0a9096d27bd"}
Apr 19 15:25:22.728173 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.728144 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7r729" event={"ID":"3f939e64-6dbd-4802-9d83-7251b53cdcb5","Type":"ContainerStarted","Data":"ce48a2ec40fe517bf4527d9c1028ed2875fba7204f57b5a3f9d75346e2fa4a93"}
Apr 19 15:25:22.729413 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.729392 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4tfml" event={"ID":"b3a083b4-d7b2-4f52-b323-b957d5ebc531","Type":"ContainerStarted","Data":"8b3cf717a0f2c68b6b1990d62ed44e165606459850230e53e0712c955fc322d8"}
Apr 19 15:25:22.730868 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.730844 2579 generic.go:358] "Generic (PLEG): container finished" podID="d6faab90-56cc-458f-bf13-4b00ae0b1686" containerID="bc88ac06701bd905509d7131c3c49ba50324564624e5923972a49dde9b9bc809" exitCode=0
Apr 19 15:25:22.730954 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.730938 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz2ds" event={"ID":"d6faab90-56cc-458f-bf13-4b00ae0b1686","Type":"ContainerDied","Data":"bc88ac06701bd905509d7131c3c49ba50324564624e5923972a49dde9b9bc809"}
Apr 19 15:25:22.787091 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.787033 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4tfml" podStartSLOduration=3.402246268 podStartE2EDuration="21.787014366s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="2026-04-19 15:25:02.724099863 +0000 UTC m=+1.830015229" lastFinishedPulling="2026-04-19 15:25:21.108867954 +0000 UTC m=+20.214783327" observedRunningTime="2026-04-19 15:25:22.764048615 +0000 UTC m=+21.869964004" watchObservedRunningTime="2026-04-19 15:25:22.787014366 +0000 UTC m=+21.892929748"
Apr 19 15:25:22.787713 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.787676 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pss7s" podStartSLOduration=3.467610004 podStartE2EDuration="21.787663478s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="2026-04-19 15:25:02.789589101 +0000 UTC m=+1.895504480" lastFinishedPulling="2026-04-19 15:25:21.109642582 +0000 UTC m=+20.215557954" observedRunningTime="2026-04-19 15:25:22.787111389 +0000 UTC m=+21.893026778" watchObservedRunningTime="2026-04-19 15:25:22.787663478 +0000 UTC m=+21.893578867"
Apr 19 15:25:22.820665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.820623 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7r729" podStartSLOduration=3.49042304 podStartE2EDuration="21.820608727s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="2026-04-19 15:25:02.779643694 +0000 UTC m=+1.885559060" lastFinishedPulling="2026-04-19 15:25:21.109829376 +0000 UTC m=+20.215744747" observedRunningTime="2026-04-19 15:25:22.820066128 +0000 UTC m=+21.925981529" watchObservedRunningTime="2026-04-19 15:25:22.820608727 +0000 UTC m=+21.926524133"
Apr 19 15:25:22.833149 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:22.833096 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2xg5z" podStartSLOduration=3.551269139 podStartE2EDuration="21.833069389s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="2026-04-19 15:25:02.800515415 +0000 UTC m=+1.906430782" lastFinishedPulling="2026-04-19 15:25:21.082315662 +0000 UTC m=+20.188231032" observedRunningTime="2026-04-19 15:25:22.832430124 +0000 UTC m=+21.938345511" watchObservedRunningTime="2026-04-19 15:25:22.833069389 +0000 UTC m=+21.938984777"
Apr 19 15:25:23.206047 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.205819 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 19 15:25:23.468446 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.468344 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-19T15:25:23.206045177Z","UUID":"22a0b47e-c0c4-4cb1-a6cf-09cff0536e2a","Handler":null,"Name":"","Endpoint":""}
Apr 19 15:25:23.472142 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.472114 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 19 15:25:23.472142 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.472150 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 19 15:25:23.541217 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.541178 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:23.541383 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:23.541312 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53"
Apr 19 15:25:23.655295 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.655255 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7r729"
Apr 19 15:25:23.655953 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.655935 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7r729"
Apr 19 15:25:23.734837 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.734755 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" event={"ID":"300a9bea-6a69-423c-8267-02f715cc3b8f","Type":"ContainerStarted","Data":"50c0416897e80647599dfab2fb38ef34b6ddc781401ad88cef74423ba22dac5e"}
Apr 19 15:25:23.736698 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.736670 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" event={"ID":"02ac498d4e3cb36b3700c57a1bf34412","Type":"ContainerStarted","Data":"84cac2ecd83f20bd1ee5e7393e890b2713804a2701ced1120b5b2009afc09a8c"}
Apr 19 15:25:23.736905 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.736881 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7r729"
Apr 19 15:25:23.737758 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.737713 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7r729"
Apr 19 15:25:23.750944 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:23.750902 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-218.ec2.internal" podStartSLOduration=22.750888868 podStartE2EDuration="22.750888868s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:25:23.750635841 +0000 UTC m=+22.856551229" watchObservedRunningTime="2026-04-19 15:25:23.750888868 +0000 UTC m=+22.856804257"
Apr 19 15:25:24.541594 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:24.541509 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:24.541774 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:24.541628 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc"
Apr 19 15:25:24.740136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:24.740101 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" event={"ID":"300a9bea-6a69-423c-8267-02f715cc3b8f","Type":"ContainerStarted","Data":"0067b8c80230139ef22f0db96f107cd0cbc0b9d52fbe0690c3b1e53b62205ceb"}
Apr 19 15:25:24.742954 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:24.742907 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log"
Apr 19 15:25:24.743249 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:24.743221 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" event={"ID":"73514b32-300b-4466-b414-022b4c2e1f8e","Type":"ContainerStarted","Data":"afb23eb2808bdba05b92656ac63158f3062d664c24afc96535a4ee216e8c1b87"}
Apr 19 15:25:25.541313
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:25.541274 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:25.541503 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:25.541413 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53" Apr 19 15:25:26.540662 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:26.540624 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:26.541187 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:26.540792 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc" Apr 19 15:25:27.541600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:27.541348 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:27.541927 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:27.541758 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53" Apr 19 15:25:27.750179 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:27.750141 2579 generic.go:358] "Generic (PLEG): container finished" podID="d6faab90-56cc-458f-bf13-4b00ae0b1686" containerID="2dead4dd2513ef2b7fcf28c8c9feeaf9a21fb67b993ba6769d2104488b62460a" exitCode=0 Apr 19 15:25:27.750349 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:27.750234 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz2ds" event={"ID":"d6faab90-56cc-458f-bf13-4b00ae0b1686","Type":"ContainerDied","Data":"2dead4dd2513ef2b7fcf28c8c9feeaf9a21fb67b993ba6769d2104488b62460a"} Apr 19 15:25:27.753419 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:27.753397 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:25:27.753769 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:27.753738 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" event={"ID":"73514b32-300b-4466-b414-022b4c2e1f8e","Type":"ContainerStarted","Data":"86b1645e5833344b023f04e9bf89ac474bf26c2e79bba2ed787a799761610e40"} Apr 19 15:25:27.754110 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:27.754092 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:27.754272 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:27.754257 2579 scope.go:117] "RemoveContainer" containerID="a599e13244418cf70e9adea124dfc597e76f165aef3b74881cb27cb2a17f3bcb" Apr 19 15:25:27.769680 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:27.769658 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:27.770573 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:25:27.770538 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9mv2f" podStartSLOduration=5.310720874 podStartE2EDuration="26.770526148s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="2026-04-19 15:25:02.784083843 +0000 UTC m=+1.889999223" lastFinishedPulling="2026-04-19 15:25:24.243889117 +0000 UTC m=+23.349804497" observedRunningTime="2026-04-19 15:25:24.756272493 +0000 UTC m=+23.862187881" watchObservedRunningTime="2026-04-19 15:25:27.770526148 +0000 UTC m=+26.876441536" Apr 19 15:25:28.541204 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:28.541170 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:28.541462 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:28.541277 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc" Apr 19 15:25:28.760281 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:28.760061 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:25:28.760695 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:28.760670 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" event={"ID":"73514b32-300b-4466-b414-022b4c2e1f8e","Type":"ContainerStarted","Data":"2dfe86ed26f8314cda35ad29109e458a8ff423d403ee25d7d3a5f8f806c1c18b"} Apr 19 15:25:28.760866 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:28.760850 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 19 15:25:28.761132 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:28.761113 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:28.775988 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:28.775959 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:28.784837 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:28.784779 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" podStartSLOduration=9.392985281 podStartE2EDuration="27.784763292s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="2026-04-19 15:25:02.753582856 +0000 UTC m=+1.859498234" lastFinishedPulling="2026-04-19 15:25:21.145360864 +0000 UTC m=+20.251276245" observedRunningTime="2026-04-19 15:25:28.783469497 +0000 UTC m=+27.889384887" watchObservedRunningTime="2026-04-19 15:25:28.784763292 +0000 UTC m=+27.890678679" Apr 19 15:25:29.097910 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:29.097835 
2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8cprr"] Apr 19 15:25:29.098101 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:29.097977 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:29.098101 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:29.098071 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53" Apr 19 15:25:29.100561 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:29.100535 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r46tx"] Apr 19 15:25:29.100697 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:29.100611 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:29.100697 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:29.100679 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc" Apr 19 15:25:29.764355 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:29.764325 2579 generic.go:358] "Generic (PLEG): container finished" podID="d6faab90-56cc-458f-bf13-4b00ae0b1686" containerID="90358b773180959eadfcf8e7a035b75888df39b8f19e0c12e2cc335be474b797" exitCode=0 Apr 19 15:25:29.765074 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:29.764410 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz2ds" event={"ID":"d6faab90-56cc-458f-bf13-4b00ae0b1686","Type":"ContainerDied","Data":"90358b773180959eadfcf8e7a035b75888df39b8f19e0c12e2cc335be474b797"} Apr 19 15:25:29.765074 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:29.764627 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 19 15:25:30.541102 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:30.541072 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:30.541209 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:30.541078 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:30.541209 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:30.541184 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc" Apr 19 15:25:30.541394 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:30.541296 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53" Apr 19 15:25:30.768470 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:30.768440 2579 generic.go:358] "Generic (PLEG): container finished" podID="d6faab90-56cc-458f-bf13-4b00ae0b1686" containerID="99fd733a5660b8d1a6c037e78aeb3ea046ca38d0484e147dbbed8d130e06ade9" exitCode=0 Apr 19 15:25:30.769177 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:30.768534 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz2ds" event={"ID":"d6faab90-56cc-458f-bf13-4b00ae0b1686","Type":"ContainerDied","Data":"99fd733a5660b8d1a6c037e78aeb3ea046ca38d0484e147dbbed8d130e06ade9"} Apr 19 15:25:30.769177 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:30.768770 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 19 15:25:31.231397 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:31.231354 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:25:32.540895 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:32.540867 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:32.541277 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:32.540879 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:32.541277 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:32.540999 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r46tx" podUID="445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc" Apr 19 15:25:32.541277 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:32.541051 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cprr" podUID="41bb40b9-2854-47c5-8759-3fbea6b42b53" Apr 19 15:25:34.154227 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.154190 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:25:34.154650 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.154340 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:25:34.154650 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.154398 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs podName:41bb40b9-2854-47c5-8759-3fbea6b42b53 nodeName:}" failed. 
No retries permitted until 2026-04-19 15:26:06.154382209 +0000 UTC m=+65.260297576 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs") pod "network-metrics-daemon-8cprr" (UID: "41bb40b9-2854-47c5-8759-3fbea6b42b53") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 15:25:34.154927 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.154907 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-218.ec2.internal" event="NodeReady" Apr 19 15:25:34.155065 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.155050 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 19 15:25:34.199510 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.199472 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rs4pv"] Apr 19 15:25:34.235172 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.235133 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2xfjp"] Apr 19 15:25:34.235338 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.235249 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rs4pv" Apr 19 15:25:34.237777 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.237696 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 19 15:25:34.237777 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.237712 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dvb5z\"" Apr 19 15:25:34.237777 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.237698 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 19 15:25:34.255166 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.255131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmzv\" (UniqueName: \"kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv\") pod \"network-check-target-r46tx\" (UID: \"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc\") " pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:25:34.255323 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.255294 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 15:25:34.255323 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.255314 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 15:25:34.255323 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.255323 2579 projected.go:194] Error preparing data for projected volume kube-api-access-rwmzv for pod openshift-network-diagnostics/network-check-target-r46tx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered] Apr 19 15:25:34.255463 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.255375 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv podName:445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc nodeName:}" failed. No retries permitted until 2026-04-19 15:26:06.255357951 +0000 UTC m=+65.361273318 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-rwmzv" (UniqueName: "kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv") pod "network-check-target-r46tx" (UID: "445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 15:25:34.263739 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.263693 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rs4pv"] Apr 19 15:25:34.263739 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.263746 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2xfjp"] Apr 19 15:25:34.263925 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.263855 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2xfjp" Apr 19 15:25:34.265971 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.265932 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 19 15:25:34.265971 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.265938 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 19 15:25:34.266163 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.266018 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 19 15:25:34.266163 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.265950 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zxl4j\"" Apr 19 15:25:34.355628 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.355590 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv" Apr 19 15:25:34.355628 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.355631 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-tmp-dir\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv" Apr 19 15:25:34.355863 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.355655 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-config-volume\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv" Apr 19 15:25:34.355863 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.355699 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwxv4\" (UniqueName: \"kubernetes.io/projected/b10af7e0-ddd6-409a-bf97-0223a35bb81a-kube-api-access-jwxv4\") pod \"ingress-canary-2xfjp\" (UID: \"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp" Apr 19 15:25:34.355863 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.355739 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert\") pod \"ingress-canary-2xfjp\" (UID: \"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp" Apr 19 15:25:34.355863 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.355768 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gllg\" (UniqueName: \"kubernetes.io/projected/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-kube-api-access-5gllg\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv" Apr 19 15:25:34.456068 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.456036 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert\") pod \"ingress-canary-2xfjp\" (UID: \"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp" Apr 19 15:25:34.456253 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.456084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-5gllg\" (UniqueName: \"kubernetes.io/projected/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-kube-api-access-5gllg\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv" Apr 19 15:25:34.456253 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.456115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv" Apr 19 15:25:34.456253 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.456131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-tmp-dir\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv" Apr 19 15:25:34.456253 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.456204 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 15:25:34.456456 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.456266 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 15:25:34.456456 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.456278 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert podName:b10af7e0-ddd6-409a-bf97-0223a35bb81a nodeName:}" failed. No retries permitted until 2026-04-19 15:25:34.956256207 +0000 UTC m=+34.062171573 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert") pod "ingress-canary-2xfjp" (UID: "b10af7e0-ddd6-409a-bf97-0223a35bb81a") : secret "canary-serving-cert" not found Apr 19 15:25:34.456456 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.456318 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-config-volume\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv" Apr 19 15:25:34.456456 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.456333 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls podName:835a8643-4c16-4d4b-bbc6-4e4a5fa3a156 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:34.956315928 +0000 UTC m=+34.062231293 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls") pod "dns-default-rs4pv" (UID: "835a8643-4c16-4d4b-bbc6-4e4a5fa3a156") : secret "dns-default-metrics-tls" not found
Apr 19 15:25:34.456456 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.456382 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwxv4\" (UniqueName: \"kubernetes.io/projected/b10af7e0-ddd6-409a-bf97-0223a35bb81a-kube-api-access-jwxv4\") pod \"ingress-canary-2xfjp\" (UID: \"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp"
Apr 19 15:25:34.456456 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.456438 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-tmp-dir\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv"
Apr 19 15:25:34.465329 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.465176 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-config-volume\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv"
Apr 19 15:25:34.466561 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.466538 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gllg\" (UniqueName: \"kubernetes.io/projected/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-kube-api-access-5gllg\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv"
Apr 19 15:25:34.466669 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.466623 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwxv4\" (UniqueName: \"kubernetes.io/projected/b10af7e0-ddd6-409a-bf97-0223a35bb81a-kube-api-access-jwxv4\") pod \"ingress-canary-2xfjp\" (UID: \"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp"
Apr 19 15:25:34.540838 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.540701 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx"
Apr 19 15:25:34.541004 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.540932 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr"
Apr 19 15:25:34.543214 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.543186 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 19 15:25:34.543214 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.543206 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 19 15:25:34.543489 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.543458 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-b2zbg\""
Apr 19 15:25:34.543605 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.543500 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 19 15:25:34.543605 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.543515 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zjn96\""
Apr 19 15:25:34.961422 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.961390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert\") pod \"ingress-canary-2xfjp\" (UID: \"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp"
Apr 19 15:25:34.961639 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:34.961463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv"
Apr 19 15:25:34.961639 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.961577 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 15:25:34.961639 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.961581 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 15:25:34.961828 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.961669 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert podName:b10af7e0-ddd6-409a-bf97-0223a35bb81a nodeName:}" failed. No retries permitted until 2026-04-19 15:25:35.961647768 +0000 UTC m=+35.067563143 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert") pod "ingress-canary-2xfjp" (UID: "b10af7e0-ddd6-409a-bf97-0223a35bb81a") : secret "canary-serving-cert" not found
Apr 19 15:25:34.961828 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:34.961689 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls podName:835a8643-4c16-4d4b-bbc6-4e4a5fa3a156 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:35.961679463 +0000 UTC m=+35.067594831 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls") pod "dns-default-rs4pv" (UID: "835a8643-4c16-4d4b-bbc6-4e4a5fa3a156") : secret "dns-default-metrics-tls" not found
Apr 19 15:25:35.968992 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:35.968949 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv"
Apr 19 15:25:35.969421 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:35.969036 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert\") pod \"ingress-canary-2xfjp\" (UID: \"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp"
Apr 19 15:25:35.969421 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:35.969132 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 15:25:35.969421 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:35.969194 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 15:25:35.969421 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:35.969210 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls podName:835a8643-4c16-4d4b-bbc6-4e4a5fa3a156 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:37.969191145 +0000 UTC m=+37.075106511 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls") pod "dns-default-rs4pv" (UID: "835a8643-4c16-4d4b-bbc6-4e4a5fa3a156") : secret "dns-default-metrics-tls" not found
Apr 19 15:25:35.969421 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:35.969245 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert podName:b10af7e0-ddd6-409a-bf97-0223a35bb81a nodeName:}" failed. No retries permitted until 2026-04-19 15:25:37.9692279 +0000 UTC m=+37.075143273 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert") pod "ingress-canary-2xfjp" (UID: "b10af7e0-ddd6-409a-bf97-0223a35bb81a") : secret "canary-serving-cert" not found
Apr 19 15:25:37.785643 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:37.785610 2579 generic.go:358] "Generic (PLEG): container finished" podID="d6faab90-56cc-458f-bf13-4b00ae0b1686" containerID="13d67f6c5a25870e2d819cb83ff51f4053a630e23fa93e1eda53d4fddeaed68b" exitCode=0
Apr 19 15:25:37.786137 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:37.785694 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz2ds" event={"ID":"d6faab90-56cc-458f-bf13-4b00ae0b1686","Type":"ContainerDied","Data":"13d67f6c5a25870e2d819cb83ff51f4053a630e23fa93e1eda53d4fddeaed68b"}
Apr 19 15:25:37.981978 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:37.981945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert\") pod \"ingress-canary-2xfjp\" (UID: \"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp"
Apr 19 15:25:37.982107 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:37.982020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv"
Apr 19 15:25:37.982158 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:37.982105 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 15:25:37.982158 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:37.982118 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 15:25:37.982240 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:37.982173 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls podName:835a8643-4c16-4d4b-bbc6-4e4a5fa3a156 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:41.982157346 +0000 UTC m=+41.088072712 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls") pod "dns-default-rs4pv" (UID: "835a8643-4c16-4d4b-bbc6-4e4a5fa3a156") : secret "dns-default-metrics-tls" not found
Apr 19 15:25:37.982240 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:37.982191 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert podName:b10af7e0-ddd6-409a-bf97-0223a35bb81a nodeName:}" failed. No retries permitted until 2026-04-19 15:25:41.982183176 +0000 UTC m=+41.088098542 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert") pod "ingress-canary-2xfjp" (UID: "b10af7e0-ddd6-409a-bf97-0223a35bb81a") : secret "canary-serving-cert" not found
Apr 19 15:25:38.790005 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:38.789976 2579 generic.go:358] "Generic (PLEG): container finished" podID="d6faab90-56cc-458f-bf13-4b00ae0b1686" containerID="2bc908b43d90fc16acd03e8d381204f783c0649fdd5f1e7ff65cbb4b7399b70b" exitCode=0
Apr 19 15:25:38.790353 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:38.790013 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz2ds" event={"ID":"d6faab90-56cc-458f-bf13-4b00ae0b1686","Type":"ContainerDied","Data":"2bc908b43d90fc16acd03e8d381204f783c0649fdd5f1e7ff65cbb4b7399b70b"}
Apr 19 15:25:39.794367 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:39.794325 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz2ds" event={"ID":"d6faab90-56cc-458f-bf13-4b00ae0b1686","Type":"ContainerStarted","Data":"37f51b9de33b08e7f0ed5625d9f623732f3348d0269376db1305e827b20f60e9"}
Apr 19 15:25:39.814384 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:39.814328 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sz2ds" podStartSLOduration=4.661889488 podStartE2EDuration="38.814311035s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="2026-04-19 15:25:02.795292029 +0000 UTC m=+1.901207409" lastFinishedPulling="2026-04-19 15:25:36.94771359 +0000 UTC m=+36.053628956" observedRunningTime="2026-04-19 15:25:39.814099124 +0000 UTC m=+38.920014511" watchObservedRunningTime="2026-04-19 15:25:39.814311035 +0000 UTC m=+38.920226424"
Apr 19 15:25:42.011613 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:42.011567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert\") pod \"ingress-canary-2xfjp\" (UID: \"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp"
Apr 19 15:25:42.012020 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:42.011639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv"
Apr 19 15:25:42.012020 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:42.011744 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 15:25:42.012020 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:42.011760 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 15:25:42.012020 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:42.011817 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert podName:b10af7e0-ddd6-409a-bf97-0223a35bb81a nodeName:}" failed. No retries permitted until 2026-04-19 15:25:50.011800616 +0000 UTC m=+49.117715982 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert") pod "ingress-canary-2xfjp" (UID: "b10af7e0-ddd6-409a-bf97-0223a35bb81a") : secret "canary-serving-cert" not found
Apr 19 15:25:42.012020 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:42.011831 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls podName:835a8643-4c16-4d4b-bbc6-4e4a5fa3a156 nodeName:}" failed.
No retries permitted until 2026-04-19 15:25:50.011825376 +0000 UTC m=+49.117740742 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls") pod "dns-default-rs4pv" (UID: "835a8643-4c16-4d4b-bbc6-4e4a5fa3a156") : secret "dns-default-metrics-tls" not found
Apr 19 15:25:50.068184 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:50.068142 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert\") pod \"ingress-canary-2xfjp\" (UID: \"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp"
Apr 19 15:25:50.068711 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:50.068204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv"
Apr 19 15:25:50.068711 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:50.068289 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 15:25:50.068711 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:50.068289 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 15:25:50.068711 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:50.068343 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls podName:835a8643-4c16-4d4b-bbc6-4e4a5fa3a156 nodeName:}" failed. No retries permitted until 2026-04-19 15:26:06.068328316 +0000 UTC m=+65.174243681 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls") pod "dns-default-rs4pv" (UID: "835a8643-4c16-4d4b-bbc6-4e4a5fa3a156") : secret "dns-default-metrics-tls" not found
Apr 19 15:25:50.068711 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:50.068356 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert podName:b10af7e0-ddd6-409a-bf97-0223a35bb81a nodeName:}" failed. No retries permitted until 2026-04-19 15:26:06.068350014 +0000 UTC m=+65.174265380 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert") pod "ingress-canary-2xfjp" (UID: "b10af7e0-ddd6-409a-bf97-0223a35bb81a") : secret "canary-serving-cert" not found
Apr 19 15:25:52.049490 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.049459 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh"]
Apr 19 15:25:52.083522 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.083492 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-569cffc76c-jl8vf"]
Apr 19 15:25:52.083667 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.083653 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh"
Apr 19 15:25:52.085731 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.085688 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 19 15:25:52.085882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.085777 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 19 15:25:52.085882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.085788 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 19 15:25:52.085882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.085873 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-tmwfv\""
Apr 19 15:25:52.086054 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.085913 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 19 15:25:52.109389 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.109361 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x"]
Apr 19 15:25:52.109553 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.109536 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf"
Apr 19 15:25:52.111742 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.111707 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b94dq\""
Apr 19 15:25:52.111742 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.111711 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 19 15:25:52.111921 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.111709 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 19 15:25:52.111921 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.111709 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 19 15:25:52.116010 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.115992 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 19 15:25:52.122014 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.121994 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg"]
Apr 19 15:25:52.122132 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.122119 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x"
Apr 19 15:25:52.124055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.124036 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 19 15:25:52.124161 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.124042 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 19 15:25:52.124161 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.124067 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-tqn9d\""
Apr 19 15:25:52.124161 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.124039 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 19 15:25:52.134087 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.134069 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh"]
Apr 19 15:25:52.134087 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.134091 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg"]
Apr 19 15:25:52.134210 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.134099 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x"]
Apr 19 15:25:52.134210 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.134107 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-569cffc76c-jl8vf"]
Apr 19 15:25:52.134210 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.134193 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg"
Apr 19 15:25:52.136338 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.136322 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 19 15:25:52.136730 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.136698 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 19 15:25:52.136811 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.136771 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 19 15:25:52.136811 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.136700 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-q7nk9\""
Apr 19 15:25:52.136924 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.136835 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 19 15:25:52.138896 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.138875 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dbkrb"]
Apr 19 15:25:52.166950 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.166921 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dbkrb"]
Apr 19 15:25:52.167133 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.167022 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-dbkrb"
Apr 19 15:25:52.169295 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.169273 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 19 15:25:52.170177 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.170155 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 19 15:25:52.170302 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.170195 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 19 15:25:52.170463 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.170442 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 19 15:25:52.170657 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.170540 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-slt85\""
Apr 19 15:25:52.176317 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.176287 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 19 15:25:52.180597 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.180566 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e18d4ee3-accb-4d8b-aad0-8801d1395e00-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k9hxh\" (UID: \"e18d4ee3-accb-4d8b-aad0-8801d1395e00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh"
Apr 19 15:25:52.180681 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.180627 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6r4v\" (UniqueName: \"kubernetes.io/projected/e18d4ee3-accb-4d8b-aad0-8801d1395e00-kube-api-access-p6r4v\") pod \"kube-storage-version-migrator-operator-6769c5d45-k9hxh\" (UID: \"e18d4ee3-accb-4d8b-aad0-8801d1395e00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh"
Apr 19 15:25:52.180681 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.180659 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e18d4ee3-accb-4d8b-aad0-8801d1395e00-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k9hxh\" (UID: \"e18d4ee3-accb-4d8b-aad0-8801d1395e00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh"
Apr 19 15:25:52.281559 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281515 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-image-registry-private-configuration\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf"
Apr 19 15:25:52.281559 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281560 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3db4c72b-149d-4caa-8a3b-7a449266aa07-ca-trust-extracted\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf"
Apr 19 15:25:52.281817 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281582 2579 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-certificates\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf"
Apr 19 15:25:52.281817 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281635 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx4tq\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-kube-api-access-rx4tq\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf"
Apr 19 15:25:52.281817 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281708 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sccvv\" (UniqueName: \"kubernetes.io/projected/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-kube-api-access-sccvv\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb"
Apr 19 15:25:52.281817 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-trusted-ca\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf"
Apr 19 15:25:52.281955 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281834 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg"
Apr 19 15:25:52.281955 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281866 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e18d4ee3-accb-4d8b-aad0-8801d1395e00-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k9hxh\" (UID: \"e18d4ee3-accb-4d8b-aad0-8801d1395e00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh"
Apr 19 15:25:52.281955 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281883 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf"
Apr 19 15:25:52.281955 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281901 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-bound-sa-token\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf"
Apr 19 15:25:52.281955 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281928 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb"
Apr 19 15:25:52.282178 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-installation-pull-secrets\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf"
Apr 19 15:25:52.282178 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281970 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-serving-cert\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb"
Apr 19 15:25:52.282178 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.281988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6r4v\" (UniqueName: \"kubernetes.io/projected/e18d4ee3-accb-4d8b-aad0-8801d1395e00-kube-api-access-p6r4v\") pod \"kube-storage-version-migrator-operator-6769c5d45-k9hxh\" (UID: \"e18d4ee3-accb-4d8b-aad0-8801d1395e00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh"
Apr 19 15:25:52.282178 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.282007 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-service-ca-bundle\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb"
Apr 19 15:25:52.282178 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.282074 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e18d4ee3-accb-4d8b-aad0-8801d1395e00-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k9hxh\" (UID: \"e18d4ee3-accb-4d8b-aad0-8801d1395e00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh"
Apr 19 15:25:52.282178 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.282119 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-tmp\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb"
Apr 19 15:25:52.282178 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.282147 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-snapshots\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb"
Apr 19 15:25:52.282431 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.282182 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn9h9\" (UniqueName: \"kubernetes.io/projected/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-kube-api-access-nn9h9\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg"
Apr 19 15:25:52.282431 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.282274 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dhs7x\" (UID: \"e42822c4-a9e0-4f2e-87cb-e89414cfc72d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:25:52.282431 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.282310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplx6\" (UniqueName: \"kubernetes.io/projected/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-kube-api-access-hplx6\") pod \"cluster-samples-operator-6dc5bdb6b4-dhs7x\" (UID: \"e42822c4-a9e0-4f2e-87cb-e89414cfc72d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:25:52.282431 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.282340 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:25:52.282431 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.282363 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e18d4ee3-accb-4d8b-aad0-8801d1395e00-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k9hxh\" (UID: \"e18d4ee3-accb-4d8b-aad0-8801d1395e00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh" Apr 19 15:25:52.285562 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.285534 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e18d4ee3-accb-4d8b-aad0-8801d1395e00-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k9hxh\" (UID: \"e18d4ee3-accb-4d8b-aad0-8801d1395e00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh" Apr 19 15:25:52.290102 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.290078 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6r4v\" (UniqueName: \"kubernetes.io/projected/e18d4ee3-accb-4d8b-aad0-8801d1395e00-kube-api-access-p6r4v\") pod \"kube-storage-version-migrator-operator-6769c5d45-k9hxh\" (UID: \"e18d4ee3-accb-4d8b-aad0-8801d1395e00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh" Apr 19 15:25:52.383654 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.383554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-service-ca-bundle\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.383654 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.383602 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-tmp\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.383654 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.383619 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-snapshots\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: 
\"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.383957 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.383771 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9h9\" (UniqueName: \"kubernetes.io/projected/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-kube-api-access-nn9h9\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:25:52.383957 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.383834 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dhs7x\" (UID: \"e42822c4-a9e0-4f2e-87cb-e89414cfc72d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:25:52.383957 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.383861 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hplx6\" (UniqueName: \"kubernetes.io/projected/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-kube-api-access-hplx6\") pod \"cluster-samples-operator-6dc5bdb6b4-dhs7x\" (UID: \"e42822c4-a9e0-4f2e-87cb-e89414cfc72d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:25:52.383957 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.383888 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 
15:25:52.383957 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.383926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-image-registry-private-configuration\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.383957 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.383936 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 15:25:52.383957 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.383952 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3db4c72b-149d-4caa-8a3b-7a449266aa07-ca-trust-extracted\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.384279 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.383975 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-certificates\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.384279 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384004 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx4tq\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-kube-api-access-rx4tq\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " 
pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.384279 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.384020 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls podName:e42822c4-a9e0-4f2e-87cb-e89414cfc72d nodeName:}" failed. No retries permitted until 2026-04-19 15:25:52.884000193 +0000 UTC m=+51.989915579 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dhs7x" (UID: "e42822c4-a9e0-4f2e-87cb-e89414cfc72d") : secret "samples-operator-tls" not found Apr 19 15:25:52.384279 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sccvv\" (UniqueName: \"kubernetes.io/projected/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-kube-api-access-sccvv\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.384279 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384091 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-trusted-ca\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.384279 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384118 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: 
\"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:25:52.384279 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384172 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.384279 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-bound-sa-token\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.384279 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.384279 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384252 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-snapshots\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.384279 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-installation-pull-secrets\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.384870 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384307 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-serving-cert\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.384870 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.384310 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 15:25:52.384870 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384403 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-service-ca-bundle\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.384870 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.384480 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls podName:6185d8e4-ff61-4ce3-9885-8aaeca0c15ca nodeName:}" failed. No retries permitted until 2026-04-19 15:25:52.884402985 +0000 UTC m=+51.990318352 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lgqmg" (UID: "6185d8e4-ff61-4ce3-9885-8aaeca0c15ca") : secret "cluster-monitoring-operator-tls" not found Apr 19 15:25:52.384870 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384499 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3db4c72b-149d-4caa-8a3b-7a449266aa07-ca-trust-extracted\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.385118 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.384928 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-certificates\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.385118 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.384937 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:25:52.385118 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.384974 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569cffc76c-jl8vf: secret "image-registry-tls" not found Apr 19 15:25:52.385118 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.385032 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls podName:3db4c72b-149d-4caa-8a3b-7a449266aa07 nodeName:}" failed. 
No retries permitted until 2026-04-19 15:25:52.885015636 +0000 UTC m=+51.990931006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls") pod "image-registry-569cffc76c-jl8vf" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07") : secret "image-registry-tls" not found Apr 19 15:25:52.385118 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.385095 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:25:52.385321 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.385205 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-tmp\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.385746 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.385705 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-trusted-ca\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.386880 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.386860 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-installation-pull-secrets\") pod \"image-registry-569cffc76c-jl8vf\" (UID: 
\"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.387202 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.387184 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-serving-cert\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.387327 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.387308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-image-registry-private-configuration\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.392536 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.392072 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh" Apr 19 15:25:52.392982 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.392952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sccvv\" (UniqueName: \"kubernetes.io/projected/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-kube-api-access-sccvv\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.393115 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.393027 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hplx6\" (UniqueName: \"kubernetes.io/projected/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-kube-api-access-hplx6\") pod \"cluster-samples-operator-6dc5bdb6b4-dhs7x\" (UID: \"e42822c4-a9e0-4f2e-87cb-e89414cfc72d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:25:52.393520 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.393501 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-bound-sa-token\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.394075 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.394056 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn9h9\" (UniqueName: \"kubernetes.io/projected/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-kube-api-access-nn9h9\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:25:52.394548 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.394533 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx4tq\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-kube-api-access-rx4tq\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.397855 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.397835 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d8e4508-63ae-4c34-9e5a-88f0e8d37185-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dbkrb\" (UID: \"0d8e4508-63ae-4c34-9e5a-88f0e8d37185\") " pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.479559 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.479522 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-dbkrb" Apr 19 15:25:52.557155 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.557123 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh"] Apr 19 15:25:52.608089 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.608058 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dbkrb"] Apr 19 15:25:52.611275 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:25:52.611244 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d8e4508_63ae_4c34_9e5a_88f0e8d37185.slice/crio-ef45a9e1e083abe6240cd58bf31cad4990b1f18383b062a92e9bfe96b4a82109 WatchSource:0}: Error finding container ef45a9e1e083abe6240cd58bf31cad4990b1f18383b062a92e9bfe96b4a82109: Status 404 returned error can't find the container with id ef45a9e1e083abe6240cd58bf31cad4990b1f18383b062a92e9bfe96b4a82109 
Apr 19 15:25:52.819252 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.819214 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh" event={"ID":"e18d4ee3-accb-4d8b-aad0-8801d1395e00","Type":"ContainerStarted","Data":"c582d20772dd591c92c8b3562475446641dccf51d989d159497eab4715ee2778"} Apr 19 15:25:52.820163 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.820133 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dbkrb" event={"ID":"0d8e4508-63ae-4c34-9e5a-88f0e8d37185","Type":"ContainerStarted","Data":"ef45a9e1e083abe6240cd58bf31cad4990b1f18383b062a92e9bfe96b4a82109"} Apr 19 15:25:52.888243 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.888202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:52.888430 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.888272 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dhs7x\" (UID: \"e42822c4-a9e0-4f2e-87cb-e89414cfc72d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:25:52.888430 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:52.888296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:25:52.888430 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.888358 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:25:52.888430 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.888371 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 15:25:52.888430 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.888386 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 15:25:52.888676 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.888438 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls podName:6185d8e4-ff61-4ce3-9885-8aaeca0c15ca nodeName:}" failed. No retries permitted until 2026-04-19 15:25:53.888425025 +0000 UTC m=+52.994340391 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lgqmg" (UID: "6185d8e4-ff61-4ce3-9885-8aaeca0c15ca") : secret "cluster-monitoring-operator-tls" not found Apr 19 15:25:52.888676 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.888451 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls podName:e42822c4-a9e0-4f2e-87cb-e89414cfc72d nodeName:}" failed. No retries permitted until 2026-04-19 15:25:53.888444623 +0000 UTC m=+52.994359989 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dhs7x" (UID: "e42822c4-a9e0-4f2e-87cb-e89414cfc72d") : secret "samples-operator-tls" not found Apr 19 15:25:52.888676 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.888378 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569cffc76c-jl8vf: secret "image-registry-tls" not found Apr 19 15:25:52.888676 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:52.888474 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls podName:3db4c72b-149d-4caa-8a3b-7a449266aa07 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:53.888469126 +0000 UTC m=+52.994384491 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls") pod "image-registry-569cffc76c-jl8vf" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07") : secret "image-registry-tls" not found Apr 19 15:25:53.898941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:53.898904 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dhs7x\" (UID: \"e42822c4-a9e0-4f2e-87cb-e89414cfc72d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:25:53.898941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:53.898942 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:25:53.899454 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:53.899005 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:53.899454 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:53.899081 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 15:25:53.899454 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:53.899121 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 15:25:53.899454 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:53.899125 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:25:53.899454 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:53.899144 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569cffc76c-jl8vf: secret "image-registry-tls" not found Apr 19 15:25:53.899454 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:53.899151 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls podName:e42822c4-a9e0-4f2e-87cb-e89414cfc72d nodeName:}" failed. 
No retries permitted until 2026-04-19 15:25:55.89913426 +0000 UTC m=+55.005049629 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dhs7x" (UID: "e42822c4-a9e0-4f2e-87cb-e89414cfc72d") : secret "samples-operator-tls" not found Apr 19 15:25:53.899454 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:53.899177 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls podName:6185d8e4-ff61-4ce3-9885-8aaeca0c15ca nodeName:}" failed. No retries permitted until 2026-04-19 15:25:55.899158309 +0000 UTC m=+55.005073677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lgqmg" (UID: "6185d8e4-ff61-4ce3-9885-8aaeca0c15ca") : secret "cluster-monitoring-operator-tls" not found Apr 19 15:25:53.899454 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:53.899195 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls podName:3db4c72b-149d-4caa-8a3b-7a449266aa07 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:55.899187656 +0000 UTC m=+55.005103021 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls") pod "image-registry-569cffc76c-jl8vf" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07") : secret "image-registry-tls" not found Apr 19 15:25:55.827454 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:55.827418 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dbkrb" event={"ID":"0d8e4508-63ae-4c34-9e5a-88f0e8d37185","Type":"ContainerStarted","Data":"1673083ebe5809a9d4854b6d18310852e47f34e2640de4e90520e7c50bb8cb26"} Apr 19 15:25:55.828862 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:55.828829 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh" event={"ID":"e18d4ee3-accb-4d8b-aad0-8801d1395e00","Type":"ContainerStarted","Data":"b669f8ea054c7bcb741a99a4c86f9a48033c35449ea459763e2ead0079b9d9a1"} Apr 19 15:25:55.846502 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:55.844153 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-dbkrb" podStartSLOduration=1.189000691 podStartE2EDuration="3.844134046s" podCreationTimestamp="2026-04-19 15:25:52 +0000 UTC" firstStartedPulling="2026-04-19 15:25:52.613208371 +0000 UTC m=+51.719123737" lastFinishedPulling="2026-04-19 15:25:55.268341723 +0000 UTC m=+54.374257092" observedRunningTime="2026-04-19 15:25:55.841554356 +0000 UTC m=+54.947469767" watchObservedRunningTime="2026-04-19 15:25:55.844134046 +0000 UTC m=+54.950049435" Apr 19 15:25:55.860549 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:55.860486 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh" podStartSLOduration=1.157853561 
podStartE2EDuration="3.860468076s" podCreationTimestamp="2026-04-19 15:25:52 +0000 UTC" firstStartedPulling="2026-04-19 15:25:52.563124602 +0000 UTC m=+51.669039968" lastFinishedPulling="2026-04-19 15:25:55.265739116 +0000 UTC m=+54.371654483" observedRunningTime="2026-04-19 15:25:55.859617858 +0000 UTC m=+54.965533247" watchObservedRunningTime="2026-04-19 15:25:55.860468076 +0000 UTC m=+54.966383465" Apr 19 15:25:55.917223 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:55.917180 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:55.917403 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:55.917299 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:25:55.917403 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:55.917314 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569cffc76c-jl8vf: secret "image-registry-tls" not found Apr 19 15:25:55.917403 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:55.917380 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls podName:3db4c72b-149d-4caa-8a3b-7a449266aa07 nodeName:}" failed. No retries permitted until 2026-04-19 15:25:59.917361661 +0000 UTC m=+59.023277031 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls") pod "image-registry-569cffc76c-jl8vf" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07") : secret "image-registry-tls" not found Apr 19 15:25:55.917572 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:55.917523 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dhs7x\" (UID: \"e42822c4-a9e0-4f2e-87cb-e89414cfc72d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:25:55.917572 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:55.917559 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:25:55.917684 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:55.917668 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 15:25:55.917786 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:55.917715 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls podName:6185d8e4-ff61-4ce3-9885-8aaeca0c15ca nodeName:}" failed. No retries permitted until 2026-04-19 15:25:59.917701242 +0000 UTC m=+59.023616612 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lgqmg" (UID: "6185d8e4-ff61-4ce3-9885-8aaeca0c15ca") : secret "cluster-monitoring-operator-tls" not found Apr 19 15:25:55.917861 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:55.917811 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 15:25:55.917861 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:55.917846 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls podName:e42822c4-a9e0-4f2e-87cb-e89414cfc72d nodeName:}" failed. No retries permitted until 2026-04-19 15:25:59.917835066 +0000 UTC m=+59.023750435 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dhs7x" (UID: "e42822c4-a9e0-4f2e-87cb-e89414cfc72d") : secret "samples-operator-tls" not found Apr 19 15:25:59.949257 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:59.949223 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dhs7x\" (UID: \"e42822c4-a9e0-4f2e-87cb-e89414cfc72d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:25:59.949257 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:59.949261 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:25:59.949666 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:59.949389 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 15:25:59.949666 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:59.949426 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 15:25:59.949666 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:59.949453 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls podName:e42822c4-a9e0-4f2e-87cb-e89414cfc72d nodeName:}" failed. No retries permitted until 2026-04-19 15:26:07.949435681 +0000 UTC m=+67.055351047 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dhs7x" (UID: "e42822c4-a9e0-4f2e-87cb-e89414cfc72d") : secret "samples-operator-tls" not found Apr 19 15:25:59.949666 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:59.949475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:25:59.949666 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:59.949507 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls podName:6185d8e4-ff61-4ce3-9885-8aaeca0c15ca nodeName:}" failed. No retries permitted until 2026-04-19 15:26:07.949484612 +0000 UTC m=+67.055399981 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lgqmg" (UID: "6185d8e4-ff61-4ce3-9885-8aaeca0c15ca") : secret "cluster-monitoring-operator-tls" not found Apr 19 15:25:59.949666 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:59.949532 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 15:25:59.949666 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:59.949542 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569cffc76c-jl8vf: secret "image-registry-tls" not found Apr 19 15:25:59.949666 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:25:59.949578 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls podName:3db4c72b-149d-4caa-8a3b-7a449266aa07 nodeName:}" failed. No retries permitted until 2026-04-19 15:26:07.949563861 +0000 UTC m=+67.055479246 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls") pod "image-registry-569cffc76c-jl8vf" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07") : secret "image-registry-tls" not found Apr 19 15:25:59.954691 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:25:59.954669 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pss7s_dac7973c-ee33-410c-8f77-093953d73a03/dns-node-resolver/0.log" Apr 19 15:26:00.771121 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:00.771094 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4tfml_b3a083b4-d7b2-4f52-b323-b957d5ebc531/node-ca/0.log" Apr 19 15:26:01.810985 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:01.810955 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxqlx" Apr 19 15:26:02.557317 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:02.557278 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-k9hxh_e18d4ee3-accb-4d8b-aad0-8801d1395e00/kube-storage-version-migrator-operator/0.log" Apr 19 15:26:06.097831 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.097779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv" Apr 19 15:26:06.098216 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.097881 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert\") pod \"ingress-canary-2xfjp\" (UID: 
\"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp" Apr 19 15:26:06.100362 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.100335 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/835a8643-4c16-4d4b-bbc6-4e4a5fa3a156-metrics-tls\") pod \"dns-default-rs4pv\" (UID: \"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156\") " pod="openshift-dns/dns-default-rs4pv" Apr 19 15:26:06.100502 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.100432 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b10af7e0-ddd6-409a-bf97-0223a35bb81a-cert\") pod \"ingress-canary-2xfjp\" (UID: \"b10af7e0-ddd6-409a-bf97-0223a35bb81a\") " pod="openshift-ingress-canary/ingress-canary-2xfjp" Apr 19 15:26:06.199229 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.199186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:26:06.201386 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.201364 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 19 15:26:06.210244 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:06.210219 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 19 15:26:06.210356 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:06.210286 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs podName:41bb40b9-2854-47c5-8759-3fbea6b42b53 nodeName:}" failed. 
No retries permitted until 2026-04-19 15:27:10.210268899 +0000 UTC m=+129.316184264 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs") pod "network-metrics-daemon-8cprr" (UID: "41bb40b9-2854-47c5-8759-3fbea6b42b53") : secret "metrics-daemon-secret" not found Apr 19 15:26:06.300530 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.300493 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmzv\" (UniqueName: \"kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv\") pod \"network-check-target-r46tx\" (UID: \"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc\") " pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:26:06.302951 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.302930 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 19 15:26:06.313457 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.313431 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 19 15:26:06.324765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.324709 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwmzv\" (UniqueName: \"kubernetes.io/projected/445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc-kube-api-access-rwmzv\") pod \"network-check-target-r46tx\" (UID: \"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc\") " pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:26:06.348708 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.348630 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dvb5z\"" Apr 19 15:26:06.355248 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.355219 2579 
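The `durationBeforeRetry` values in the mount failures above grow as 2s, 4s, 8s, 16s, and eventually 1m4s, i.e. the kubelet applies exponential backoff to failed `MountVolume.SetUp` operations while it waits for the missing secrets to appear. A minimal sketch of that doubling pattern, assuming a 2s initial wait and a 64s cap to match the values visible in this log (the names and constants here are illustrative, not the kubelet's actual source):

```python
# Illustrative sketch of the exponential backoff seen in the
# durationBeforeRetry log fields: 2s, 4s, 8s, 16s, ... up to 1m4s.
# INITIAL_BACKOFF and MAX_BACKOFF are assumptions taken from this
# log, not constants copied from the kubelet code base.
from typing import Optional

INITIAL_BACKOFF = 2.0   # seconds before the first retry
MAX_BACKOFF = 64.0      # cap, matching the 1m4s entry above

def next_backoff(current: Optional[float]) -> float:
    """Wait before the next retry of a failed mount operation.

    None means the operation has not failed before; otherwise the
    previous wait is doubled until it reaches the cap.
    """
    if current is None:
        return INITIAL_BACKOFF
    return min(current * 2.0, MAX_BACKOFF)

def backoff_sequence(failures: int) -> list:
    """Waits produced by `failures` consecutive failures."""
    waits, current = [], None
    for _ in range(failures):
        current = next_backoff(current)
        waits.append(current)
    return waits
```

Once the secret is finally created (as happens for `samples-operator-tls` and `registry-tls` at 15:26:08), the pending operation succeeds on its next scheduled retry and the backoff state is discarded.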
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-b2zbg\"" Apr 19 15:26:06.356761 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.356740 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rs4pv" Apr 19 15:26:06.363644 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.363622 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:26:06.375262 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.375232 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zxl4j\"" Apr 19 15:26:06.383257 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.383230 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2xfjp" Apr 19 15:26:06.500382 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.500338 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rs4pv"] Apr 19 15:26:06.510605 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.510580 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r46tx"] Apr 19 15:26:06.519670 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:06.519643 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod835a8643_4c16_4d4b_bbc6_4e4a5fa3a156.slice/crio-a01c54938bdadce9ef62550d9b94b2f619ba8faa23732abd891efe461a21177e WatchSource:0}: Error finding container a01c54938bdadce9ef62550d9b94b2f619ba8faa23732abd891efe461a21177e: Status 404 returned error can't find the container with id a01c54938bdadce9ef62550d9b94b2f619ba8faa23732abd891efe461a21177e Apr 19 15:26:06.520285 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:06.520257 
2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod445f4ff9_7c10_4b4e_8d46_b2e4e449c5bc.slice/crio-3a0f797748c37fc290013c8b2a34c54689acc54087320d0725913a292513a61d WatchSource:0}: Error finding container 3a0f797748c37fc290013c8b2a34c54689acc54087320d0725913a292513a61d: Status 404 returned error can't find the container with id 3a0f797748c37fc290013c8b2a34c54689acc54087320d0725913a292513a61d Apr 19 15:26:06.526803 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.526784 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2xfjp"] Apr 19 15:26:06.529958 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:06.529937 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb10af7e0_ddd6_409a_bf97_0223a35bb81a.slice/crio-8162d02befd4b3176a40f170dd61b1b504f9a92ad6f529618c22a43e25ba8488 WatchSource:0}: Error finding container 8162d02befd4b3176a40f170dd61b1b504f9a92ad6f529618c22a43e25ba8488: Status 404 returned error can't find the container with id 8162d02befd4b3176a40f170dd61b1b504f9a92ad6f529618c22a43e25ba8488 Apr 19 15:26:06.852069 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.852028 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rs4pv" event={"ID":"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156","Type":"ContainerStarted","Data":"a01c54938bdadce9ef62550d9b94b2f619ba8faa23732abd891efe461a21177e"} Apr 19 15:26:06.853060 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.853028 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2xfjp" event={"ID":"b10af7e0-ddd6-409a-bf97-0223a35bb81a","Type":"ContainerStarted","Data":"8162d02befd4b3176a40f170dd61b1b504f9a92ad6f529618c22a43e25ba8488"} Apr 19 15:26:06.854023 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:06.853999 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r46tx" event={"ID":"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc","Type":"ContainerStarted","Data":"3a0f797748c37fc290013c8b2a34c54689acc54087320d0725913a292513a61d"} Apr 19 15:26:08.017610 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:08.017573 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dhs7x\" (UID: \"e42822c4-a9e0-4f2e-87cb-e89414cfc72d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:26:08.018088 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:08.017627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:26:08.018088 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:08.017703 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:26:08.018088 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:08.017827 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 15:26:08.018088 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:08.017928 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls podName:6185d8e4-ff61-4ce3-9885-8aaeca0c15ca nodeName:}" failed. No retries permitted until 2026-04-19 15:26:24.017885572 +0000 UTC m=+83.123800955 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lgqmg" (UID: "6185d8e4-ff61-4ce3-9885-8aaeca0c15ca") : secret "cluster-monitoring-operator-tls" not found Apr 19 15:26:08.020563 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:08.020516 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e42822c4-a9e0-4f2e-87cb-e89414cfc72d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dhs7x\" (UID: \"e42822c4-a9e0-4f2e-87cb-e89414cfc72d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:26:08.020689 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:08.020624 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls\") pod \"image-registry-569cffc76c-jl8vf\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:26:08.033004 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:08.032976 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-tqn9d\"" Apr 19 15:26:08.041935 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:08.041913 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" Apr 19 15:26:08.320009 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:08.319930 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b94dq\"" Apr 19 15:26:08.327846 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:08.327813 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:26:10.407050 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.407020 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x"] Apr 19 15:26:10.425216 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.425185 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-569cffc76c-jl8vf"] Apr 19 15:26:10.429202 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:10.429136 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db4c72b_149d_4caa_8a3b_7a449266aa07.slice/crio-89336557b37ea3af7205313bf5f116b87e62bcca753f0cd805b28054cbae3a31 WatchSource:0}: Error finding container 89336557b37ea3af7205313bf5f116b87e62bcca753f0cd805b28054cbae3a31: Status 404 returned error can't find the container with id 89336557b37ea3af7205313bf5f116b87e62bcca753f0cd805b28054cbae3a31 Apr 19 15:26:10.866348 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.866264 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" event={"ID":"3db4c72b-149d-4caa-8a3b-7a449266aa07","Type":"ContainerStarted","Data":"a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b"} Apr 19 15:26:10.866348 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.866304 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" event={"ID":"3db4c72b-149d-4caa-8a3b-7a449266aa07","Type":"ContainerStarted","Data":"89336557b37ea3af7205313bf5f116b87e62bcca753f0cd805b28054cbae3a31"} Apr 19 15:26:10.866348 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.866349 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:26:10.870920 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.870884 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rs4pv" event={"ID":"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156","Type":"ContainerStarted","Data":"ae384b9460e9260320f2a6bbcd24734256a116232c510d482ceacab9dd106178"} Apr 19 15:26:10.871081 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.870938 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rs4pv" event={"ID":"835a8643-4c16-4d4b-bbc6-4e4a5fa3a156","Type":"ContainerStarted","Data":"d1406ca934be46491ae2c92681a3f19464dfcd6260e87fc895e945deb1ea1815"} Apr 19 15:26:10.871081 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.871058 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rs4pv" Apr 19 15:26:10.872455 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.872423 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" event={"ID":"e42822c4-a9e0-4f2e-87cb-e89414cfc72d","Type":"ContainerStarted","Data":"a9f27765ecd2ff070dc3b5a139c790445aedea745e619d13770d8ce8c48ff663"} Apr 19 15:26:10.873882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.873858 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2xfjp" 
event={"ID":"b10af7e0-ddd6-409a-bf97-0223a35bb81a","Type":"ContainerStarted","Data":"4c4303c4b3fe1098fbdac29ec64789e4ec09968a8156c69d181a59be0773fbdb"} Apr 19 15:26:10.875493 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.875472 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r46tx" event={"ID":"445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc","Type":"ContainerStarted","Data":"182b947f100bd2445e0c366a38a3aad9ebeb9252b711ffca196a728818737799"} Apr 19 15:26:10.875618 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.875604 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:26:10.885678 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.885633 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" podStartSLOduration=18.885618162 podStartE2EDuration="18.885618162s" podCreationTimestamp="2026-04-19 15:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:26:10.884867481 +0000 UTC m=+69.990782867" watchObservedRunningTime="2026-04-19 15:26:10.885618162 +0000 UTC m=+69.991533549" Apr 19 15:26:10.898215 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.898146 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2xfjp" podStartSLOduration=33.182488847 podStartE2EDuration="36.898126786s" podCreationTimestamp="2026-04-19 15:25:34 +0000 UTC" firstStartedPulling="2026-04-19 15:26:06.531759814 +0000 UTC m=+65.637675181" lastFinishedPulling="2026-04-19 15:26:10.24739775 +0000 UTC m=+69.353313120" observedRunningTime="2026-04-19 15:26:10.897919708 +0000 UTC m=+70.003835118" watchObservedRunningTime="2026-04-19 15:26:10.898126786 +0000 UTC m=+70.004042174" Apr 19 15:26:10.911086 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.911015 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-r46tx" podStartSLOduration=66.169565305 podStartE2EDuration="1m9.910997083s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="2026-04-19 15:26:06.522241862 +0000 UTC m=+65.628157229" lastFinishedPulling="2026-04-19 15:26:10.263673639 +0000 UTC m=+69.369589007" observedRunningTime="2026-04-19 15:26:10.910854239 +0000 UTC m=+70.016769629" watchObservedRunningTime="2026-04-19 15:26:10.910997083 +0000 UTC m=+70.016912474" Apr 19 15:26:10.927226 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:10.927160 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rs4pv" podStartSLOduration=33.201795184 podStartE2EDuration="36.927142342s" podCreationTimestamp="2026-04-19 15:25:34 +0000 UTC" firstStartedPulling="2026-04-19 15:26:06.521779274 +0000 UTC m=+65.627694640" lastFinishedPulling="2026-04-19 15:26:10.247126428 +0000 UTC m=+69.353041798" observedRunningTime="2026-04-19 15:26:10.926897999 +0000 UTC m=+70.032813388" watchObservedRunningTime="2026-04-19 15:26:10.927142342 +0000 UTC m=+70.033057733" Apr 19 15:26:12.882000 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:12.881966 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" event={"ID":"e42822c4-a9e0-4f2e-87cb-e89414cfc72d","Type":"ContainerStarted","Data":"148c91da32459bc1dbaf55ef45836342a6f050673836f9989addf5a9cf6225a1"} Apr 19 15:26:12.882000 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:12.882006 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" 
event={"ID":"e42822c4-a9e0-4f2e-87cb-e89414cfc72d","Type":"ContainerStarted","Data":"47115ec744c66b00af2db593c4165703c6453247927d9da8e47c003ec25da504"} Apr 19 15:26:12.897272 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:12.897224 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dhs7x" podStartSLOduration=19.156812641 podStartE2EDuration="20.897211673s" podCreationTimestamp="2026-04-19 15:25:52 +0000 UTC" firstStartedPulling="2026-04-19 15:26:10.43924873 +0000 UTC m=+69.545164096" lastFinishedPulling="2026-04-19 15:26:12.179647752 +0000 UTC m=+71.285563128" observedRunningTime="2026-04-19 15:26:12.896911046 +0000 UTC m=+72.002826434" watchObservedRunningTime="2026-04-19 15:26:12.897211673 +0000 UTC m=+72.003127060" Apr 19 15:26:20.881123 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:20.881008 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rs4pv" Apr 19 15:26:22.327092 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.327059 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr"] Apr 19 15:26:22.331096 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.331076 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr" Apr 19 15:26:22.333983 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.333960 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 19 15:26:22.334104 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.333959 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 19 15:26:22.334104 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.333960 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-c5wj8\"" Apr 19 15:26:22.338467 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.338236 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr"] Apr 19 15:26:22.390549 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.390510 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-569cffc76c-jl8vf"] Apr 19 15:26:22.426183 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.426155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edae232e-010b-4d24-a5b7-d4e138925f66-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vlqlr\" (UID: \"edae232e-010b-4d24-a5b7-d4e138925f66\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr" Apr 19 15:26:22.426183 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.426195 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/edae232e-010b-4d24-a5b7-d4e138925f66-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-vlqlr\" (UID: \"edae232e-010b-4d24-a5b7-d4e138925f66\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr" Apr 19 15:26:22.447942 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.447914 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-mf22j"] Apr 19 15:26:22.450951 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.450936 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.453127 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.453106 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 19 15:26:22.453230 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.453210 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 19 15:26:22.453485 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.453463 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jvwp2\"" Apr 19 15:26:22.461401 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.461379 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mf22j"] Apr 19 15:26:22.527143 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.527107 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edae232e-010b-4d24-a5b7-d4e138925f66-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vlqlr\" (UID: \"edae232e-010b-4d24-a5b7-d4e138925f66\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr" Apr 19 15:26:22.527143 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.527156 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lnft\" (UniqueName: \"kubernetes.io/projected/8aec71fb-cb0e-441f-9f19-9346759e030b-kube-api-access-9lnft\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.527415 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.527180 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8aec71fb-cb0e-441f-9f19-9346759e030b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.527415 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.527209 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/edae232e-010b-4d24-a5b7-d4e138925f66-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vlqlr\" (UID: \"edae232e-010b-4d24-a5b7-d4e138925f66\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr" Apr 19 15:26:22.527415 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.527250 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8aec71fb-cb0e-441f-9f19-9346759e030b-data-volume\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.527415 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.527287 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/8aec71fb-cb0e-441f-9f19-9346759e030b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.527415 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.527326 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8aec71fb-cb0e-441f-9f19-9346759e030b-crio-socket\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.527927 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.527903 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edae232e-010b-4d24-a5b7-d4e138925f66-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vlqlr\" (UID: \"edae232e-010b-4d24-a5b7-d4e138925f66\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr" Apr 19 15:26:22.529762 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.529745 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/edae232e-010b-4d24-a5b7-d4e138925f66-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vlqlr\" (UID: \"edae232e-010b-4d24-a5b7-d4e138925f66\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr" Apr 19 15:26:22.532768 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.532713 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-548869cb46-b9zkq"] Apr 19 15:26:22.536224 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.536205 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.546685 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.546661 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-548869cb46-b9zkq"] Apr 19 15:26:22.628788 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.628695 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8aec71fb-cb0e-441f-9f19-9346759e030b-crio-socket\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.628788 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.628761 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18816144-4892-4c9c-8870-c79f4966f41a-trusted-ca\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.628966 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.628822 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8aec71fb-cb0e-441f-9f19-9346759e030b-crio-socket\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.628966 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.628846 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/18816144-4892-4c9c-8870-c79f4966f41a-registry-certificates\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " 
pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.628966 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.628876 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/18816144-4892-4c9c-8870-c79f4966f41a-registry-tls\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.628966 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.628913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18816144-4892-4c9c-8870-c79f4966f41a-bound-sa-token\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.629101 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.628972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lnft\" (UniqueName: \"kubernetes.io/projected/8aec71fb-cb0e-441f-9f19-9346759e030b-kube-api-access-9lnft\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.629101 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.628991 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/18816144-4892-4c9c-8870-c79f4966f41a-image-registry-private-configuration\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.629101 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.629015 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/18816144-4892-4c9c-8870-c79f4966f41a-ca-trust-extracted\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.629101 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.629044 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8aec71fb-cb0e-441f-9f19-9346759e030b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.629101 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.629066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8aec71fb-cb0e-441f-9f19-9346759e030b-data-volume\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.629101 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.629084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8aec71fb-cb0e-441f-9f19-9346759e030b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.629357 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.629109 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/18816144-4892-4c9c-8870-c79f4966f41a-installation-pull-secrets\") pod 
\"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.629357 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.629129 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d895t\" (UniqueName: \"kubernetes.io/projected/18816144-4892-4c9c-8870-c79f4966f41a-kube-api-access-d895t\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.629512 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.629493 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8aec71fb-cb0e-441f-9f19-9346759e030b-data-volume\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.629664 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.629646 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8aec71fb-cb0e-441f-9f19-9346759e030b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.631458 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.631440 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8aec71fb-cb0e-441f-9f19-9346759e030b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.640417 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.640396 2579 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr" Apr 19 15:26:22.642912 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.642884 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lnft\" (UniqueName: \"kubernetes.io/projected/8aec71fb-cb0e-441f-9f19-9346759e030b-kube-api-access-9lnft\") pod \"insights-runtime-extractor-mf22j\" (UID: \"8aec71fb-cb0e-441f-9f19-9346759e030b\") " pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.729612 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.729576 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/18816144-4892-4c9c-8870-c79f4966f41a-registry-tls\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.729612 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.729613 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18816144-4892-4c9c-8870-c79f4966f41a-bound-sa-token\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.729864 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.729649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/18816144-4892-4c9c-8870-c79f4966f41a-image-registry-private-configuration\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.729864 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.729677 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/18816144-4892-4c9c-8870-c79f4966f41a-ca-trust-extracted\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.729864 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.729737 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/18816144-4892-4c9c-8870-c79f4966f41a-installation-pull-secrets\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.729864 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.729767 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d895t\" (UniqueName: \"kubernetes.io/projected/18816144-4892-4c9c-8870-c79f4966f41a-kube-api-access-d895t\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.729864 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.729815 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18816144-4892-4c9c-8870-c79f4966f41a-trusted-ca\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.730140 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.729884 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/18816144-4892-4c9c-8870-c79f4966f41a-registry-certificates\") pod \"image-registry-548869cb46-b9zkq\" (UID: 
\"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.730350 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.730321 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/18816144-4892-4c9c-8870-c79f4966f41a-ca-trust-extracted\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.731911 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.731834 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/18816144-4892-4c9c-8870-c79f4966f41a-registry-certificates\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.731911 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.731878 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18816144-4892-4c9c-8870-c79f4966f41a-trusted-ca\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.733081 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.732811 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/18816144-4892-4c9c-8870-c79f4966f41a-registry-tls\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.733081 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.732899 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" 
(UniqueName: \"kubernetes.io/secret/18816144-4892-4c9c-8870-c79f4966f41a-image-registry-private-configuration\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.733595 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.733572 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/18816144-4892-4c9c-8870-c79f4966f41a-installation-pull-secrets\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.739268 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.739236 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18816144-4892-4c9c-8870-c79f4966f41a-bound-sa-token\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.739362 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.739248 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d895t\" (UniqueName: \"kubernetes.io/projected/18816144-4892-4c9c-8870-c79f4966f41a-kube-api-access-d895t\") pod \"image-registry-548869cb46-b9zkq\" (UID: \"18816144-4892-4c9c-8870-c79f4966f41a\") " pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.760063 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.760034 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mf22j" Apr 19 15:26:22.760232 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.760117 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr"] Apr 19 15:26:22.765475 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:22.765446 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedae232e_010b_4d24_a5b7_d4e138925f66.slice/crio-2960072bc1060deda180bdd6505d7e5d001bab7d4ecf5cc801be727690084d3e WatchSource:0}: Error finding container 2960072bc1060deda180bdd6505d7e5d001bab7d4ecf5cc801be727690084d3e: Status 404 returned error can't find the container with id 2960072bc1060deda180bdd6505d7e5d001bab7d4ecf5cc801be727690084d3e Apr 19 15:26:22.845173 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.845137 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:22.881795 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.881696 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mf22j"] Apr 19 15:26:22.886033 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:22.886002 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aec71fb_cb0e_441f_9f19_9346759e030b.slice/crio-3c932a2252e48d00edecc7aa5f628a2c231156e7a98f182c47c0182098742e50 WatchSource:0}: Error finding container 3c932a2252e48d00edecc7aa5f628a2c231156e7a98f182c47c0182098742e50: Status 404 returned error can't find the container with id 3c932a2252e48d00edecc7aa5f628a2c231156e7a98f182c47c0182098742e50 Apr 19 15:26:22.908281 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.908249 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-mf22j" event={"ID":"8aec71fb-cb0e-441f-9f19-9346759e030b","Type":"ContainerStarted","Data":"3c932a2252e48d00edecc7aa5f628a2c231156e7a98f182c47c0182098742e50"} Apr 19 15:26:22.909382 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.909357 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr" event={"ID":"edae232e-010b-4d24-a5b7-d4e138925f66","Type":"ContainerStarted","Data":"2960072bc1060deda180bdd6505d7e5d001bab7d4ecf5cc801be727690084d3e"} Apr 19 15:26:22.972314 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:22.972261 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-548869cb46-b9zkq"] Apr 19 15:26:22.975616 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:22.975590 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18816144_4892_4c9c_8870_c79f4966f41a.slice/crio-d7e8e14597dda4da0336c405cb73a7ff537dd8966b8ec306c357105a761525b0 WatchSource:0}: Error finding container d7e8e14597dda4da0336c405cb73a7ff537dd8966b8ec306c357105a761525b0: Status 404 returned error can't find the container with id d7e8e14597dda4da0336c405cb73a7ff537dd8966b8ec306c357105a761525b0 Apr 19 15:26:23.913398 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:23.913363 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mf22j" event={"ID":"8aec71fb-cb0e-441f-9f19-9346759e030b","Type":"ContainerStarted","Data":"ffb40c442dfaa93c74acecd4edeb2565847378f978e3aa078fe40b31a4e578a0"} Apr 19 15:26:23.914953 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:23.914922 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-548869cb46-b9zkq" 
event={"ID":"18816144-4892-4c9c-8870-c79f4966f41a","Type":"ContainerStarted","Data":"9b13d9239fe0241c1b092f85dc90e784c3fc988964c7e147a1c9a1cdd91172b7"} Apr 19 15:26:23.915065 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:23.914962 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-548869cb46-b9zkq" event={"ID":"18816144-4892-4c9c-8870-c79f4966f41a","Type":"ContainerStarted","Data":"d7e8e14597dda4da0336c405cb73a7ff537dd8966b8ec306c357105a761525b0"} Apr 19 15:26:23.915120 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:23.915072 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:23.932591 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:23.932550 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-548869cb46-b9zkq" podStartSLOduration=1.932535181 podStartE2EDuration="1.932535181s" podCreationTimestamp="2026-04-19 15:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:26:23.931654379 +0000 UTC m=+83.037569767" watchObservedRunningTime="2026-04-19 15:26:23.932535181 +0000 UTC m=+83.038450568" Apr 19 15:26:24.039080 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:24.039036 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:26:24.041574 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:24.041552 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6185d8e4-ff61-4ce3-9885-8aaeca0c15ca-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lgqmg\" (UID: \"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:26:24.244748 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:24.244686 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-q7nk9\"" Apr 19 15:26:24.253348 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:24.253326 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" Apr 19 15:26:24.369981 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:24.369950 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg"] Apr 19 15:26:24.374353 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:24.374301 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6185d8e4_ff61_4ce3_9885_8aaeca0c15ca.slice/crio-c4a1251031ff612ac28933c412d123c1c7602eafaa04e6c7cd92a745430abafd WatchSource:0}: Error finding container c4a1251031ff612ac28933c412d123c1c7602eafaa04e6c7cd92a745430abafd: Status 404 returned error can't find the container with id c4a1251031ff612ac28933c412d123c1c7602eafaa04e6c7cd92a745430abafd Apr 19 15:26:24.919826 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:24.919783 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mf22j" event={"ID":"8aec71fb-cb0e-441f-9f19-9346759e030b","Type":"ContainerStarted","Data":"25099d8b929518b73aac7531e16c823be383bf82ccf4d5e700227b29a80d7754"} Apr 19 15:26:24.921271 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:24.921240 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr" event={"ID":"edae232e-010b-4d24-a5b7-d4e138925f66","Type":"ContainerStarted","Data":"9cb049fde8eed6121cf2f748c6d795464adae4f09678094ff83f4a820e36be43"}
Apr 19 15:26:24.922477 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:24.922449 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" event={"ID":"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca","Type":"ContainerStarted","Data":"c4a1251031ff612ac28933c412d123c1c7602eafaa04e6c7cd92a745430abafd"}
Apr 19 15:26:24.935089 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:24.935036 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlqlr" podStartSLOduration=1.802879707 podStartE2EDuration="2.935015937s" podCreationTimestamp="2026-04-19 15:26:22 +0000 UTC" firstStartedPulling="2026-04-19 15:26:22.767504817 +0000 UTC m=+81.873420186" lastFinishedPulling="2026-04-19 15:26:23.899641049 +0000 UTC m=+83.005556416" observedRunningTime="2026-04-19 15:26:24.934658521 +0000 UTC m=+84.040573910" watchObservedRunningTime="2026-04-19 15:26:24.935015937 +0000 UTC m=+84.040931326"
Apr 19 15:26:26.770891 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.770858 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927"]
Apr 19 15:26:26.773481 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.773464 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927"
Apr 19 15:26:26.775566 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.775544 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 19 15:26:26.775692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.775649 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-vzv9m\""
Apr 19 15:26:26.780603 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.780583 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927"]
Apr 19 15:26:26.865740 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.865675 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/20ed457f-c873-4a9b-ae0c-36282b3723a2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-d8927\" (UID: \"20ed457f-c873-4a9b-ae0c-36282b3723a2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927"
Apr 19 15:26:26.930792 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.930751 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mf22j" event={"ID":"8aec71fb-cb0e-441f-9f19-9346759e030b","Type":"ContainerStarted","Data":"9fb559c52c4ba938a677691e0f8d1237757f38c784b795cde77c46080202dbc9"}
Apr 19 15:26:26.932037 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.932015 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" event={"ID":"6185d8e4-ff61-4ce3-9885-8aaeca0c15ca","Type":"ContainerStarted","Data":"96a2d8d15ca517c71ded41f5ef32ace3d5dd307dceb888afcd337cff7b942690"}
Apr 19 15:26:26.946058 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.946015 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-mf22j" podStartSLOduration=1.602854259 podStartE2EDuration="4.946002523s" podCreationTimestamp="2026-04-19 15:26:22 +0000 UTC" firstStartedPulling="2026-04-19 15:26:22.937772063 +0000 UTC m=+82.043687430" lastFinishedPulling="2026-04-19 15:26:26.280920324 +0000 UTC m=+85.386835694" observedRunningTime="2026-04-19 15:26:26.945598209 +0000 UTC m=+86.051513598" watchObservedRunningTime="2026-04-19 15:26:26.946002523 +0000 UTC m=+86.051917944"
Apr 19 15:26:26.958569 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.958526 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lgqmg" podStartSLOduration=33.033121265 podStartE2EDuration="34.958514321s" podCreationTimestamp="2026-04-19 15:25:52 +0000 UTC" firstStartedPulling="2026-04-19 15:26:24.376685742 +0000 UTC m=+83.482601108" lastFinishedPulling="2026-04-19 15:26:26.302078785 +0000 UTC m=+85.407994164" observedRunningTime="2026-04-19 15:26:26.957469151 +0000 UTC m=+86.063384551" watchObservedRunningTime="2026-04-19 15:26:26.958514321 +0000 UTC m=+86.064429702"
Apr 19 15:26:26.967071 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.967048 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/20ed457f-c873-4a9b-ae0c-36282b3723a2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-d8927\" (UID: \"20ed457f-c873-4a9b-ae0c-36282b3723a2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927"
Apr 19 15:26:26.969596 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:26.969574 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/20ed457f-c873-4a9b-ae0c-36282b3723a2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-d8927\" (UID: \"20ed457f-c873-4a9b-ae0c-36282b3723a2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927"
Apr 19 15:26:27.082580 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:27.082491 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927"
Apr 19 15:26:27.217955 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:27.217925 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927"]
Apr 19 15:26:27.221509 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:27.221477 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20ed457f_c873_4a9b_ae0c_36282b3723a2.slice/crio-b3df4a22d8088a0034593b9f52287400fbfa77cb5c802ced144b8d1e3f2ae59f WatchSource:0}: Error finding container b3df4a22d8088a0034593b9f52287400fbfa77cb5c802ced144b8d1e3f2ae59f: Status 404 returned error can't find the container with id b3df4a22d8088a0034593b9f52287400fbfa77cb5c802ced144b8d1e3f2ae59f
Apr 19 15:26:27.937096 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:27.937050 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927" event={"ID":"20ed457f-c873-4a9b-ae0c-36282b3723a2","Type":"ContainerStarted","Data":"b3df4a22d8088a0034593b9f52287400fbfa77cb5c802ced144b8d1e3f2ae59f"}
Apr 19 15:26:28.941418 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:28.941378 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927" event={"ID":"20ed457f-c873-4a9b-ae0c-36282b3723a2","Type":"ContainerStarted","Data":"19e79508a8996b12505401be066b7029c7e8211562ca0fda9a544879333cb9f9"}
Apr 19 15:26:28.941826 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:28.941562 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927"
Apr 19 15:26:28.946235 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:28.946209 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927"
Apr 19 15:26:28.954470 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:28.954421 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d8927" podStartSLOduration=1.773483334 podStartE2EDuration="2.954405037s" podCreationTimestamp="2026-04-19 15:26:26 +0000 UTC" firstStartedPulling="2026-04-19 15:26:27.223416414 +0000 UTC m=+86.329331780" lastFinishedPulling="2026-04-19 15:26:28.404338113 +0000 UTC m=+87.510253483" observedRunningTime="2026-04-19 15:26:28.954004842 +0000 UTC m=+88.059920231" watchObservedRunningTime="2026-04-19 15:26:28.954405037 +0000 UTC m=+88.060320425"
Apr 19 15:26:32.395747 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:32.395702 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf"
Apr 19 15:26:34.163695 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.163657 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"]
Apr 19 15:26:34.168506 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.168485 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.171352 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.171323 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 19 15:26:34.171352 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.171338 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-vtfzz\""
Apr 19 15:26:34.171536 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.171338 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 19 15:26:34.171536 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.171333 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 19 15:26:34.173081 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.173060 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-94dch"]
Apr 19 15:26:34.176156 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.176135 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.176313 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.176291 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"]
Apr 19 15:26:34.178349 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.178328 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-p5ljz\""
Apr 19 15:26:34.178467 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.178405 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 19 15:26:34.178656 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.178635 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 19 15:26:34.178801 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.178779 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 19 15:26:34.188089 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.188066 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-94dch"]
Apr 19 15:26:34.189041 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.189016 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4jl6h"]
Apr 19 15:26:34.192300 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.192278 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.194692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.194673 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 19 15:26:34.194928 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.194915 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 19 15:26:34.195555 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.195535 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 19 15:26:34.195636 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.195604 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s2772\""
Apr 19 15:26:34.229437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.229406 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/497c2a9d-d724-4361-a355-cc733a36f75f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.229437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.229443 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/497c2a9d-d724-4361-a355-cc733a36f75f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.229640 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.229463 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/497c2a9d-d724-4361-a355-cc733a36f75f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.229640 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.229489 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8rk5\" (UniqueName: \"kubernetes.io/projected/497c2a9d-d724-4361-a355-cc733a36f75f-kube-api-access-g8rk5\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.329978 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.329933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-metrics-client-ca\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.329978 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.329982 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.330181 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330039 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/036941a2-bcef-4eb6-8c54-93c25b36bbb2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.330181 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-wtmp\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.330181 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330104 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/497c2a9d-d724-4361-a355-cc733a36f75f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.330181 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330123 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-tls\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.330181 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330142 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/497c2a9d-d724-4361-a355-cc733a36f75f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.330414 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/497c2a9d-d724-4361-a355-cc733a36f75f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.330414 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330232 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-textfile\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.330414 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330270 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8rk5\" (UniqueName: \"kubernetes.io/projected/497c2a9d-d724-4361-a355-cc733a36f75f-kube-api-access-g8rk5\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.330414 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330297 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-root\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.330414 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:34.330313 2579 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 19 15:26:34.330414 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330323 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqf7x\" (UniqueName: \"kubernetes.io/projected/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-kube-api-access-zqf7x\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.330414 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330391 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.330414 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:34.330409 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/497c2a9d-d724-4361-a355-cc733a36f75f-openshift-state-metrics-tls podName:497c2a9d-d724-4361-a355-cc733a36f75f nodeName:}" failed. No retries permitted until 2026-04-19 15:26:34.830384861 +0000 UTC m=+93.936300229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/497c2a9d-d724-4361-a355-cc733a36f75f-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-x6zt4" (UID: "497c2a9d-d724-4361-a355-cc733a36f75f") : secret "openshift-state-metrics-tls" not found
Apr 19 15:26:34.330812 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330480 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-sys\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.330812 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330520 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/036941a2-bcef-4eb6-8c54-93c25b36bbb2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.330812 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330549 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.330812 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330577 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-accelerators-collector-config\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.330812 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330610 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.330812 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330635 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5pl2\" (UniqueName: \"kubernetes.io/projected/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-api-access-f5pl2\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.330992 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.330849 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/497c2a9d-d724-4361-a355-cc733a36f75f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.332774 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.332753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/497c2a9d-d724-4361-a355-cc733a36f75f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.340750 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.340708 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8rk5\" (UniqueName: \"kubernetes.io/projected/497c2a9d-d724-4361-a355-cc733a36f75f-kube-api-access-g8rk5\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.431595 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-textfile\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.431796 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431610 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-root\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.431796 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqf7x\" (UniqueName: \"kubernetes.io/projected/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-kube-api-access-zqf7x\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.431796 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.431796 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431682 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-sys\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.431796 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431698 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-root\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.431796 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431710 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/036941a2-bcef-4eb6-8c54-93c25b36bbb2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.431796 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431785 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.432112 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-sys\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.432112 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431815 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-accelerators-collector-config\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.432112 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431873 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.432112 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431907 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5pl2\" (UniqueName: \"kubernetes.io/projected/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-api-access-f5pl2\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.432112 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431946 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-metrics-client-ca\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.432112 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.431975 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.432112 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.432001 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/036941a2-bcef-4eb6-8c54-93c25b36bbb2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.432112 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.432020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-wtmp\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.432112 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.432050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-tls\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.432552 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:34.432185 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 19 15:26:34.432552 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:34.432243 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-tls podName:8e99a9b5-7492-4de3-860f-2ea9f58eaf8a nodeName:}" failed. No retries permitted until 2026-04-19 15:26:34.932224026 +0000 UTC m=+94.038139394 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-tls") pod "node-exporter-4jl6h" (UID: "8e99a9b5-7492-4de3-860f-2ea9f58eaf8a") : secret "node-exporter-tls" not found
Apr 19 15:26:34.432552 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:34.432250 2579 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 19 15:26:34.432552 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:34.432309 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-tls podName:036941a2-bcef-4eb6-8c54-93c25b36bbb2 nodeName:}" failed. No retries permitted until 2026-04-19 15:26:34.93228901 +0000 UTC m=+94.038204380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-94dch" (UID: "036941a2-bcef-4eb6-8c54-93c25b36bbb2") : secret "kube-state-metrics-tls" not found
Apr 19 15:26:34.432552 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.432399 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/036941a2-bcef-4eb6-8c54-93c25b36bbb2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.432552 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.432418 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-wtmp\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.432552 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.432475 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/036941a2-bcef-4eb6-8c54-93c25b36bbb2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.432855 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.432642 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-textfile\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.432855 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.432648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.432949 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.432931 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-metrics-client-ca\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.433054 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.433020 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-accelerators-collector-config\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.434449 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.434428 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.434651 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.434634 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.438999 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.438978 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5pl2\" (UniqueName: \"kubernetes.io/projected/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-api-access-f5pl2\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch"
Apr 19 15:26:34.442298 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.442277 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqf7x\" (UniqueName: \"kubernetes.io/projected/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-kube-api-access-zqf7x\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h"
Apr 19 15:26:34.835316 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.835206 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/497c2a9d-d724-4361-a355-cc733a36f75f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"
Apr 19 15:26:34.837833 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.837795 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/497c2a9d-d724-4361-a355-cc733a36f75f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-x6zt4\" (UID: \"497c2a9d-d724-4361-a355-cc733a36f75f\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4" Apr 19 15:26:34.936770 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.936708 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch" Apr 19 15:26:34.936940 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.936780 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-tls\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h" Apr 19 15:26:34.939399 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.939369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8e99a9b5-7492-4de3-860f-2ea9f58eaf8a-node-exporter-tls\") pod \"node-exporter-4jl6h\" (UID: \"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a\") " pod="openshift-monitoring/node-exporter-4jl6h" Apr 19 15:26:34.939399 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:34.939394 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/036941a2-bcef-4eb6-8c54-93c25b36bbb2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-94dch\" (UID: \"036941a2-bcef-4eb6-8c54-93c25b36bbb2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch" Apr 19 15:26:35.078969 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:35.078931 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4" Apr 19 15:26:35.085797 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:35.085711 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch" Apr 19 15:26:35.101538 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:35.101509 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4jl6h" Apr 19 15:26:35.113857 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:35.113799 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e99a9b5_7492_4de3_860f_2ea9f58eaf8a.slice/crio-ff95c1dae89bc9de9750a9158472b71fe6ea68c694d0a9b2b08af7224b41fa14 WatchSource:0}: Error finding container ff95c1dae89bc9de9750a9158472b71fe6ea68c694d0a9b2b08af7224b41fa14: Status 404 returned error can't find the container with id ff95c1dae89bc9de9750a9158472b71fe6ea68c694d0a9b2b08af7224b41fa14 Apr 19 15:26:35.229714 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:35.229514 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4"] Apr 19 15:26:35.232245 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:35.232214 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod497c2a9d_d724_4361_a355_cc733a36f75f.slice/crio-ef90312ff81859e48b10dd27255d4f5f6562f0add071389aaad958cda41aa0a6 WatchSource:0}: Error finding container ef90312ff81859e48b10dd27255d4f5f6562f0add071389aaad958cda41aa0a6: Status 404 returned error can't find the container with id ef90312ff81859e48b10dd27255d4f5f6562f0add071389aaad958cda41aa0a6 Apr 19 15:26:35.248533 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:35.248499 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/kube-state-metrics-69db897b98-94dch"] Apr 19 15:26:35.252009 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:35.251970 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod036941a2_bcef_4eb6_8c54_93c25b36bbb2.slice/crio-a94feb0655c8ee85ff43083138837023f5738ade028561f0ad28f47044b68e66 WatchSource:0}: Error finding container a94feb0655c8ee85ff43083138837023f5738ade028561f0ad28f47044b68e66: Status 404 returned error can't find the container with id a94feb0655c8ee85ff43083138837023f5738ade028561f0ad28f47044b68e66 Apr 19 15:26:35.961824 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:35.961754 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4jl6h" event={"ID":"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a","Type":"ContainerStarted","Data":"ff95c1dae89bc9de9750a9158472b71fe6ea68c694d0a9b2b08af7224b41fa14"} Apr 19 15:26:35.963771 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:35.963737 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4" event={"ID":"497c2a9d-d724-4361-a355-cc733a36f75f","Type":"ContainerStarted","Data":"cd077e0700bb4531a86a42e3310fe07fbd7a29bf3397ff47b6b00cac6f7a4f61"} Apr 19 15:26:35.963771 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:35.963770 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4" event={"ID":"497c2a9d-d724-4361-a355-cc733a36f75f","Type":"ContainerStarted","Data":"9886da899a2cca4d44a0e9b660f853451933aef3de44105d604d13c0c65b0983"} Apr 19 15:26:35.963963 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:35.963784 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4" 
event={"ID":"497c2a9d-d724-4361-a355-cc733a36f75f","Type":"ContainerStarted","Data":"ef90312ff81859e48b10dd27255d4f5f6562f0add071389aaad958cda41aa0a6"} Apr 19 15:26:35.964888 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:35.964866 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch" event={"ID":"036941a2-bcef-4eb6-8c54-93c25b36bbb2","Type":"ContainerStarted","Data":"a94feb0655c8ee85ff43083138837023f5738ade028561f0ad28f47044b68e66"} Apr 19 15:26:36.969312 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:36.969273 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e99a9b5-7492-4de3-860f-2ea9f58eaf8a" containerID="0aa99acfaa819c0ebcaba37ac589fada1a92019920c59c456585e987374df22b" exitCode=0 Apr 19 15:26:36.969645 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:36.969361 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4jl6h" event={"ID":"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a","Type":"ContainerDied","Data":"0aa99acfaa819c0ebcaba37ac589fada1a92019920c59c456585e987374df22b"} Apr 19 15:26:37.974397 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:37.974354 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4" event={"ID":"497c2a9d-d724-4361-a355-cc733a36f75f","Type":"ContainerStarted","Data":"b0585f804497a5a76620230481d1f23c8fdb5627ad70417a8bf29fb520a182a8"} Apr 19 15:26:37.976220 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:37.976193 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch" event={"ID":"036941a2-bcef-4eb6-8c54-93c25b36bbb2","Type":"ContainerStarted","Data":"b0536f0e31ec85e17a94b72a859786e24296375be91587bb5b6e51bbc99983c4"} Apr 19 15:26:37.976327 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:37.976225 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch" event={"ID":"036941a2-bcef-4eb6-8c54-93c25b36bbb2","Type":"ContainerStarted","Data":"043fb273445e620d2b34fb5a5adf7d435937f807c40e9c065c11971e6f481e70"} Apr 19 15:26:37.976327 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:37.976241 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch" event={"ID":"036941a2-bcef-4eb6-8c54-93c25b36bbb2","Type":"ContainerStarted","Data":"f7814a5e16e5a141506802d52933b4fc82035947786e0dc0a337d66a561227fb"} Apr 19 15:26:37.978020 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:37.977997 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4jl6h" event={"ID":"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a","Type":"ContainerStarted","Data":"e6ebcef73738cba36bd91f47722a17e7f78281cc897c3c59ca863478cae64e19"} Apr 19 15:26:37.978097 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:37.978027 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4jl6h" event={"ID":"8e99a9b5-7492-4de3-860f-2ea9f58eaf8a","Type":"ContainerStarted","Data":"cca8758086329cf5906344eb3c734aebfe68fb98fbbbf6d1d02c1ccfc19acf91"} Apr 19 15:26:37.991125 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:37.991080 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x6zt4" podStartSLOduration=2.441012478 podStartE2EDuration="3.991069284s" podCreationTimestamp="2026-04-19 15:26:34 +0000 UTC" firstStartedPulling="2026-04-19 15:26:35.377955377 +0000 UTC m=+94.483870743" lastFinishedPulling="2026-04-19 15:26:36.928012169 +0000 UTC m=+96.033927549" observedRunningTime="2026-04-19 15:26:37.990509345 +0000 UTC m=+97.096424736" watchObservedRunningTime="2026-04-19 15:26:37.991069284 +0000 UTC m=+97.096984671" Apr 19 15:26:38.015966 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.015024 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-94dch" podStartSLOduration=2.342612303 podStartE2EDuration="4.015003402s" podCreationTimestamp="2026-04-19 15:26:34 +0000 UTC" firstStartedPulling="2026-04-19 15:26:35.253969388 +0000 UTC m=+94.359884754" lastFinishedPulling="2026-04-19 15:26:36.926360484 +0000 UTC m=+96.032275853" observedRunningTime="2026-04-19 15:26:38.013322662 +0000 UTC m=+97.119238052" watchObservedRunningTime="2026-04-19 15:26:38.015003402 +0000 UTC m=+97.120918791" Apr 19 15:26:38.035137 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.035084 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4jl6h" podStartSLOduration=3.1238341099999998 podStartE2EDuration="4.035069778s" podCreationTimestamp="2026-04-19 15:26:34 +0000 UTC" firstStartedPulling="2026-04-19 15:26:35.116160618 +0000 UTC m=+94.222075987" lastFinishedPulling="2026-04-19 15:26:36.027396288 +0000 UTC m=+95.133311655" observedRunningTime="2026-04-19 15:26:38.034374828 +0000 UTC m=+97.140290218" watchObservedRunningTime="2026-04-19 15:26:38.035069778 +0000 UTC m=+97.140985165" Apr 19 15:26:38.568645 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.568609 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9"] Apr 19 15:26:38.571693 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.571672 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.573999 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.573967 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 19 15:26:38.574098 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.574058 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 19 15:26:38.574322 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.574293 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-mqf48\"" Apr 19 15:26:38.574378 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.574333 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-cj835n44cm80u\"" Apr 19 15:26:38.574952 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.574935 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 19 15:26:38.575014 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.574964 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 19 15:26:38.582746 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.582707 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9"] Apr 19 15:26:38.672469 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.672433 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5317d4de-3d06-404c-a2c2-c50612bb163d-secret-metrics-server-client-certs\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: 
\"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.672625 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.672500 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5317d4de-3d06-404c-a2c2-c50612bb163d-metrics-server-audit-profiles\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.672625 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.672573 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9chpf\" (UniqueName: \"kubernetes.io/projected/5317d4de-3d06-404c-a2c2-c50612bb163d-kube-api-access-9chpf\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.672625 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.672607 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5317d4de-3d06-404c-a2c2-c50612bb163d-secret-metrics-server-tls\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.672787 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.672658 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5317d4de-3d06-404c-a2c2-c50612bb163d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 
15:26:38.672787 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.672693 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5317d4de-3d06-404c-a2c2-c50612bb163d-audit-log\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.672849 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.672805 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5317d4de-3d06-404c-a2c2-c50612bb163d-client-ca-bundle\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.774002 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.773964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5317d4de-3d06-404c-a2c2-c50612bb163d-secret-metrics-server-client-certs\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.774002 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.774006 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5317d4de-3d06-404c-a2c2-c50612bb163d-metrics-server-audit-profiles\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.774230 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.774058 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9chpf\" (UniqueName: 
\"kubernetes.io/projected/5317d4de-3d06-404c-a2c2-c50612bb163d-kube-api-access-9chpf\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.774230 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.774083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5317d4de-3d06-404c-a2c2-c50612bb163d-secret-metrics-server-tls\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.774230 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.774130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5317d4de-3d06-404c-a2c2-c50612bb163d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.774230 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.774156 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5317d4de-3d06-404c-a2c2-c50612bb163d-audit-log\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.774429 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.774283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5317d4de-3d06-404c-a2c2-c50612bb163d-client-ca-bundle\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 
15:26:38.774918 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.774883 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5317d4de-3d06-404c-a2c2-c50612bb163d-audit-log\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.775436 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.775404 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5317d4de-3d06-404c-a2c2-c50612bb163d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.775521 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.775421 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5317d4de-3d06-404c-a2c2-c50612bb163d-metrics-server-audit-profiles\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.776984 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.776946 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5317d4de-3d06-404c-a2c2-c50612bb163d-secret-metrics-server-client-certs\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.777067 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.777025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/5317d4de-3d06-404c-a2c2-c50612bb163d-secret-metrics-server-tls\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.777124 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.777070 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5317d4de-3d06-404c-a2c2-c50612bb163d-client-ca-bundle\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.781425 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.781405 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9chpf\" (UniqueName: \"kubernetes.io/projected/5317d4de-3d06-404c-a2c2-c50612bb163d-kube-api-access-9chpf\") pod \"metrics-server-6bbcbb9c4d-x7js9\" (UID: \"5317d4de-3d06-404c-a2c2-c50612bb163d\") " pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.881534 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.881444 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:38.971780 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.971749 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh"] Apr 19 15:26:38.978653 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.978621 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh" Apr 19 15:26:38.982829 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.982324 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 19 15:26:38.982829 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.982661 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-jpj2d\"" Apr 19 15:26:38.985799 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:38.985779 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh"] Apr 19 15:26:39.022345 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:39.022178 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9"] Apr 19 15:26:39.024993 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:39.024963 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5317d4de_3d06_404c_a2c2_c50612bb163d.slice/crio-93f63077e390a28dfc0884cdeb24ba36091840bc5e3c530d7bbba0e06e4a065b WatchSource:0}: Error finding container 93f63077e390a28dfc0884cdeb24ba36091840bc5e3c530d7bbba0e06e4a065b: Status 404 returned error can't find the container with id 93f63077e390a28dfc0884cdeb24ba36091840bc5e3c530d7bbba0e06e4a065b Apr 19 15:26:39.077546 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:39.077504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/87f936be-8278-46c5-8f16-d81b0eb95086-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jd7xh\" (UID: \"87f936be-8278-46c5-8f16-d81b0eb95086\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh" Apr 19 15:26:39.178298 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:26:39.178219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/87f936be-8278-46c5-8f16-d81b0eb95086-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jd7xh\" (UID: \"87f936be-8278-46c5-8f16-d81b0eb95086\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh" Apr 19 15:26:39.178432 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:39.178365 2579 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 19 15:26:39.178432 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:39.178430 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87f936be-8278-46c5-8f16-d81b0eb95086-monitoring-plugin-cert podName:87f936be-8278-46c5-8f16-d81b0eb95086 nodeName:}" failed. No retries permitted until 2026-04-19 15:26:39.678413567 +0000 UTC m=+98.784328936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/87f936be-8278-46c5-8f16-d81b0eb95086-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-jd7xh" (UID: "87f936be-8278-46c5-8f16-d81b0eb95086") : secret "monitoring-plugin-cert" not found Apr 19 15:26:39.682085 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:39.682047 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/87f936be-8278-46c5-8f16-d81b0eb95086-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jd7xh\" (UID: \"87f936be-8278-46c5-8f16-d81b0eb95086\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh" Apr 19 15:26:39.685020 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:39.684999 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/87f936be-8278-46c5-8f16-d81b0eb95086-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jd7xh\" (UID: \"87f936be-8278-46c5-8f16-d81b0eb95086\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh" Apr 19 15:26:39.893172 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:39.893136 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh" Apr 19 15:26:39.989449 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:39.989400 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" event={"ID":"5317d4de-3d06-404c-a2c2-c50612bb163d","Type":"ContainerStarted","Data":"93f63077e390a28dfc0884cdeb24ba36091840bc5e3c530d7bbba0e06e4a065b"} Apr 19 15:26:40.026348 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:40.026309 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh"] Apr 19 15:26:40.993356 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:40.993315 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh" event={"ID":"87f936be-8278-46c5-8f16-d81b0eb95086","Type":"ContainerStarted","Data":"b335373f2b75a95176970a6ad5101a7af0dedae0e6429fa299340129268bb127"} Apr 19 15:26:40.994707 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:40.994679 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" event={"ID":"5317d4de-3d06-404c-a2c2-c50612bb163d","Type":"ContainerStarted","Data":"ba0a14b100a76b00e58ac1dcc06ad73a41f69b9aee5ab5d74e3d5ffff507c4da"} Apr 19 15:26:41.015436 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:41.015383 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" podStartSLOduration=1.576574704 podStartE2EDuration="3.015368864s" 
podCreationTimestamp="2026-04-19 15:26:38 +0000 UTC" firstStartedPulling="2026-04-19 15:26:39.026871283 +0000 UTC m=+98.132786650" lastFinishedPulling="2026-04-19 15:26:40.465665441 +0000 UTC m=+99.571580810" observedRunningTime="2026-04-19 15:26:41.014191868 +0000 UTC m=+100.120107258" watchObservedRunningTime="2026-04-19 15:26:41.015368864 +0000 UTC m=+100.121284251" Apr 19 15:26:41.880932 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:41.880877 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-r46tx" Apr 19 15:26:43.004814 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:43.004775 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh" event={"ID":"87f936be-8278-46c5-8f16-d81b0eb95086","Type":"ContainerStarted","Data":"ba137b09d690469877aab979d6a09bfb8e933c593e3ebcca3da1efd7b80aff5f"} Apr 19 15:26:43.005275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:43.004984 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh" Apr 19 15:26:43.009895 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:43.009868 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh" Apr 19 15:26:43.019427 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:43.019376 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jd7xh" podStartSLOduration=3.383891287 podStartE2EDuration="5.019363864s" podCreationTimestamp="2026-04-19 15:26:38 +0000 UTC" firstStartedPulling="2026-04-19 15:26:40.417155292 +0000 UTC m=+99.523070659" lastFinishedPulling="2026-04-19 15:26:42.052627869 +0000 UTC m=+101.158543236" observedRunningTime="2026-04-19 15:26:43.018486007 +0000 UTC m=+102.124401400" watchObservedRunningTime="2026-04-19 
15:26:43.019363864 +0000 UTC m=+102.125279252" Apr 19 15:26:44.926607 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:44.926576 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-548869cb46-b9zkq" Apr 19 15:26:46.321560 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:46.321524 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-jq6s6"] Apr 19 15:26:46.324589 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:46.324567 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jq6s6" Apr 19 15:26:46.327051 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:46.327030 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-swcn5\"" Apr 19 15:26:46.327156 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:46.327030 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 19 15:26:46.327156 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:46.327032 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 19 15:26:46.333343 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:46.333324 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jq6s6"] Apr 19 15:26:46.447090 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:46.447054 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9cs9\" (UniqueName: \"kubernetes.io/projected/a60042c0-48bc-4697-aa4f-73a65e8b15a5-kube-api-access-s9cs9\") pod \"downloads-6bcc868b7-jq6s6\" (UID: \"a60042c0-48bc-4697-aa4f-73a65e8b15a5\") " pod="openshift-console/downloads-6bcc868b7-jq6s6" Apr 19 15:26:46.548586 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:26:46.548548 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9cs9\" (UniqueName: \"kubernetes.io/projected/a60042c0-48bc-4697-aa4f-73a65e8b15a5-kube-api-access-s9cs9\") pod \"downloads-6bcc868b7-jq6s6\" (UID: \"a60042c0-48bc-4697-aa4f-73a65e8b15a5\") " pod="openshift-console/downloads-6bcc868b7-jq6s6" Apr 19 15:26:46.556406 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:46.556374 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9cs9\" (UniqueName: \"kubernetes.io/projected/a60042c0-48bc-4697-aa4f-73a65e8b15a5-kube-api-access-s9cs9\") pod \"downloads-6bcc868b7-jq6s6\" (UID: \"a60042c0-48bc-4697-aa4f-73a65e8b15a5\") " pod="openshift-console/downloads-6bcc868b7-jq6s6" Apr 19 15:26:46.634011 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:46.633914 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jq6s6" Apr 19 15:26:46.760215 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:46.760185 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jq6s6"] Apr 19 15:26:46.763481 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:26:46.763456 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda60042c0_48bc_4697_aa4f_73a65e8b15a5.slice/crio-cb69e2e6cf2fe88f0945a4e775ad0599b6ad09ed5b0b41821085d98475493eb7 WatchSource:0}: Error finding container cb69e2e6cf2fe88f0945a4e775ad0599b6ad09ed5b0b41821085d98475493eb7: Status 404 returned error can't find the container with id cb69e2e6cf2fe88f0945a4e775ad0599b6ad09ed5b0b41821085d98475493eb7 Apr 19 15:26:47.017349 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.017309 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jq6s6" 
event={"ID":"a60042c0-48bc-4697-aa4f-73a65e8b15a5","Type":"ContainerStarted","Data":"cb69e2e6cf2fe88f0945a4e775ad0599b6ad09ed5b0b41821085d98475493eb7"} Apr 19 15:26:47.409558 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.409447 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" podUID="3db4c72b-149d-4caa-8a3b-7a449266aa07" containerName="registry" containerID="cri-o://a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b" gracePeriod=30 Apr 19 15:26:47.676474 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.676447 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:26:47.863265 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.863220 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-trusted-ca\") pod \"3db4c72b-149d-4caa-8a3b-7a449266aa07\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " Apr 19 15:26:47.863449 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.863287 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls\") pod \"3db4c72b-149d-4caa-8a3b-7a449266aa07\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " Apr 19 15:26:47.863449 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.863331 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-bound-sa-token\") pod \"3db4c72b-149d-4caa-8a3b-7a449266aa07\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " Apr 19 15:26:47.863449 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.863398 2579 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3db4c72b-149d-4caa-8a3b-7a449266aa07-ca-trust-extracted\") pod \"3db4c72b-149d-4caa-8a3b-7a449266aa07\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " Apr 19 15:26:47.863449 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.863430 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx4tq\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-kube-api-access-rx4tq\") pod \"3db4c72b-149d-4caa-8a3b-7a449266aa07\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " Apr 19 15:26:47.863659 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.863561 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-certificates\") pod \"3db4c72b-149d-4caa-8a3b-7a449266aa07\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " Apr 19 15:26:47.863659 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.863612 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-installation-pull-secrets\") pod \"3db4c72b-149d-4caa-8a3b-7a449266aa07\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " Apr 19 15:26:47.863659 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.863647 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-image-registry-private-configuration\") pod \"3db4c72b-149d-4caa-8a3b-7a449266aa07\" (UID: \"3db4c72b-149d-4caa-8a3b-7a449266aa07\") " Apr 19 15:26:47.863832 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.863653 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3db4c72b-149d-4caa-8a3b-7a449266aa07" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:26:47.864487 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.864170 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3db4c72b-149d-4caa-8a3b-7a449266aa07" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:26:47.866610 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.866565 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-kube-api-access-rx4tq" (OuterVolumeSpecName: "kube-api-access-rx4tq") pod "3db4c72b-149d-4caa-8a3b-7a449266aa07" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07"). InnerVolumeSpecName "kube-api-access-rx4tq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:26:47.866744 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.866645 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3db4c72b-149d-4caa-8a3b-7a449266aa07" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 15:26:47.866852 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.866824 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3db4c72b-149d-4caa-8a3b-7a449266aa07" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:26:47.866915 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.866884 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3db4c72b-149d-4caa-8a3b-7a449266aa07" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:26:47.867295 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.867272 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3db4c72b-149d-4caa-8a3b-7a449266aa07" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 15:26:47.872232 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.872203 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db4c72b-149d-4caa-8a3b-7a449266aa07-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3db4c72b-149d-4caa-8a3b-7a449266aa07" (UID: "3db4c72b-149d-4caa-8a3b-7a449266aa07"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:26:47.964645 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.964608 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3db4c72b-149d-4caa-8a3b-7a449266aa07-ca-trust-extracted\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:26:47.964645 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.964645 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rx4tq\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-kube-api-access-rx4tq\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:26:47.964882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.964662 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-certificates\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:26:47.964882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.964677 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-installation-pull-secrets\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:26:47.964882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.964693 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3db4c72b-149d-4caa-8a3b-7a449266aa07-image-registry-private-configuration\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:26:47.964882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.964708 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3db4c72b-149d-4caa-8a3b-7a449266aa07-trusted-ca\") on node \"ip-10-0-133-218.ec2.internal\" 
DevicePath \"\"" Apr 19 15:26:47.964882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.964740 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-registry-tls\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:26:47.964882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:47.964756 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db4c72b-149d-4caa-8a3b-7a449266aa07-bound-sa-token\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:26:48.021959 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:48.021919 2579 generic.go:358] "Generic (PLEG): container finished" podID="3db4c72b-149d-4caa-8a3b-7a449266aa07" containerID="a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b" exitCode=0 Apr 19 15:26:48.022130 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:48.022008 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" Apr 19 15:26:48.022130 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:48.022012 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" event={"ID":"3db4c72b-149d-4caa-8a3b-7a449266aa07","Type":"ContainerDied","Data":"a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b"} Apr 19 15:26:48.022130 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:48.022051 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-569cffc76c-jl8vf" event={"ID":"3db4c72b-149d-4caa-8a3b-7a449266aa07","Type":"ContainerDied","Data":"89336557b37ea3af7205313bf5f116b87e62bcca753f0cd805b28054cbae3a31"} Apr 19 15:26:48.022130 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:48.022070 2579 scope.go:117] "RemoveContainer" containerID="a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b" Apr 19 15:26:48.032443 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:48.032418 2579 scope.go:117] "RemoveContainer" containerID="a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b" Apr 19 15:26:48.032704 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:26:48.032677 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b\": container with ID starting with a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b not found: ID does not exist" containerID="a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b" Apr 19 15:26:48.032825 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:48.032714 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b"} err="failed to get container status 
\"a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b\": rpc error: code = NotFound desc = could not find container \"a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b\": container with ID starting with a073d054a85685559ba14e88f86acd79cd5310e2018d66d1882de5bf1d00c09b not found: ID does not exist" Apr 19 15:26:48.045263 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:48.045233 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-569cffc76c-jl8vf"] Apr 19 15:26:48.053130 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:48.053105 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-569cffc76c-jl8vf"] Apr 19 15:26:49.545982 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:49.545945 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db4c72b-149d-4caa-8a3b-7a449266aa07" path="/var/lib/kubelet/pods/3db4c72b-149d-4caa-8a3b-7a449266aa07/volumes" Apr 19 15:26:56.917266 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.917227 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b46876d5d-ftfw7"] Apr 19 15:26:56.917682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.917592 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db4c72b-149d-4caa-8a3b-7a449266aa07" containerName="registry" Apr 19 15:26:56.917682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.917608 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db4c72b-149d-4caa-8a3b-7a449266aa07" containerName="registry" Apr 19 15:26:56.917682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.917678 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3db4c72b-149d-4caa-8a3b-7a449266aa07" containerName="registry" Apr 19 15:26:56.921336 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.921315 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:56.923827 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.923799 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 19 15:26:56.924924 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.924712 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-kvtgg\"" Apr 19 15:26:56.924924 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.924768 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 19 15:26:56.924924 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.924778 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 19 15:26:56.924924 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.924872 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 19 15:26:56.924924 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.924899 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 19 15:26:56.931138 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.931112 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b46876d5d-ftfw7"] Apr 19 15:26:56.950110 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.950079 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-serving-cert\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:56.950274 ip-10-0-133-218 kubenswrapper[2579]: 
I0419 15:26:56.950175 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-service-ca\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:56.950274 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.950200 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5prhw\" (UniqueName: \"kubernetes.io/projected/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-kube-api-access-5prhw\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:56.950274 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.950241 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-oauth-serving-cert\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:56.950451 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.950373 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-oauth-config\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:56.950451 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:56.950435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-config\") pod \"console-5b46876d5d-ftfw7\" 
(UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.050896 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.050841 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-oauth-config\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.051083 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.050937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-config\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.051083 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.050974 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-serving-cert\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.051083 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.051056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-service-ca\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.051083 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.051079 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5prhw\" (UniqueName: 
\"kubernetes.io/projected/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-kube-api-access-5prhw\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.051343 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.051106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-oauth-serving-cert\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.052027 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.052002 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-oauth-serving-cert\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.052148 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.052118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-service-ca\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.052219 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.052064 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-config\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.054054 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.054029 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-oauth-config\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.054160 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.054031 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-serving-cert\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.060225 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.060201 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5prhw\" (UniqueName: \"kubernetes.io/projected/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-kube-api-access-5prhw\") pod \"console-5b46876d5d-ftfw7\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:57.233556 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:57.233518 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:26:58.882052 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:58.882012 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:26:58.882565 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:26:58.882085 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:27:03.823065 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:03.823034 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b46876d5d-ftfw7"] Apr 19 15:27:03.826568 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:27:03.826528 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2ebd73f_76c2_4762_8ce4_e0e3f46eedc0.slice/crio-ec062491b19de5847f7e9652ce458d2a0f9c527713a8c7fa5b060b0deb11c74e WatchSource:0}: Error finding container ec062491b19de5847f7e9652ce458d2a0f9c527713a8c7fa5b060b0deb11c74e: Status 404 returned error can't find the container with id ec062491b19de5847f7e9652ce458d2a0f9c527713a8c7fa5b060b0deb11c74e Apr 19 15:27:04.073993 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:04.073893 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jq6s6" event={"ID":"a60042c0-48bc-4697-aa4f-73a65e8b15a5","Type":"ContainerStarted","Data":"bdee2bfe07ef4990ce3a1f9946d421a25ac04ff0e0fda077a0c8d0ed82e80f78"} Apr 19 15:27:04.074162 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:04.074072 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-jq6s6" Apr 19 15:27:04.075137 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:04.075112 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b46876d5d-ftfw7" 
event={"ID":"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0","Type":"ContainerStarted","Data":"ec062491b19de5847f7e9652ce458d2a0f9c527713a8c7fa5b060b0deb11c74e"} Apr 19 15:27:04.091399 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:04.091341 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-jq6s6" podStartSLOduration=1.08814501 podStartE2EDuration="18.091323195s" podCreationTimestamp="2026-04-19 15:26:46 +0000 UTC" firstStartedPulling="2026-04-19 15:26:46.765375414 +0000 UTC m=+105.871290780" lastFinishedPulling="2026-04-19 15:27:03.7685536 +0000 UTC m=+122.874468965" observedRunningTime="2026-04-19 15:27:04.091126761 +0000 UTC m=+123.197042150" watchObservedRunningTime="2026-04-19 15:27:04.091323195 +0000 UTC m=+123.197238583" Apr 19 15:27:04.102398 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:04.102371 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-jq6s6" Apr 19 15:27:06.209101 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.209058 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c4dddf6d5-nkgrt"] Apr 19 15:27:06.240789 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.240752 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c4dddf6d5-nkgrt"] Apr 19 15:27:06.240975 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.240909 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.254232 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.253971 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 19 15:27:06.337298 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.337119 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-config\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.337298 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.337198 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-trusted-ca-bundle\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.337298 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.337246 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-serving-cert\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.337615 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.337303 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-oauth-config\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " 
pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.337615 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.337433 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkcqd\" (UniqueName: \"kubernetes.io/projected/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-kube-api-access-tkcqd\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.337615 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.337491 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-oauth-serving-cert\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.337615 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.337523 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-service-ca\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.438197 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.438157 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-trusted-ca-bundle\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.438515 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.438229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-serving-cert\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.438515 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.438280 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-oauth-config\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.438515 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.438322 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkcqd\" (UniqueName: \"kubernetes.io/projected/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-kube-api-access-tkcqd\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.439812 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.438827 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-oauth-serving-cert\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.439812 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.439223 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-service-ca\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.439812 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.439278 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-config\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.439812 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.439450 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-trusted-ca-bundle\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.440019 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.439929 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-service-ca\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.440623 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.440443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-oauth-serving-cert\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.451568 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.451533 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-serving-cert\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.452222 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.451955 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-oauth-config\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.452222 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.452058 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-config\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.455743 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.455660 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkcqd\" (UniqueName: \"kubernetes.io/projected/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-kube-api-access-tkcqd\") pod \"console-c4dddf6d5-nkgrt\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:06.562868 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:06.562292 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:07.423147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:07.423108 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c4dddf6d5-nkgrt"] Apr 19 15:27:07.426427 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:27:07.426394 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2fc52e5_27a4_4f86_9d56_db3e1aec4727.slice/crio-384bee5c62e5d7005a06429ad4c3c4fe5a9abc6b7a66da720a968fb718e93b88 WatchSource:0}: Error finding container 384bee5c62e5d7005a06429ad4c3c4fe5a9abc6b7a66da720a968fb718e93b88: Status 404 returned error can't find the container with id 384bee5c62e5d7005a06429ad4c3c4fe5a9abc6b7a66da720a968fb718e93b88 Apr 19 15:27:08.090552 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:08.090511 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c4dddf6d5-nkgrt" event={"ID":"d2fc52e5-27a4-4f86-9d56-db3e1aec4727","Type":"ContainerStarted","Data":"49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c"} Apr 19 15:27:08.090552 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:08.090556 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c4dddf6d5-nkgrt" event={"ID":"d2fc52e5-27a4-4f86-9d56-db3e1aec4727","Type":"ContainerStarted","Data":"384bee5c62e5d7005a06429ad4c3c4fe5a9abc6b7a66da720a968fb718e93b88"} Apr 19 15:27:08.092220 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:08.092188 2579 generic.go:358] "Generic (PLEG): container finished" podID="0d8e4508-63ae-4c34-9e5a-88f0e8d37185" containerID="1673083ebe5809a9d4854b6d18310852e47f34e2640de4e90520e7c50bb8cb26" exitCode=0 Apr 19 15:27:08.092362 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:08.092260 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dbkrb" 
event={"ID":"0d8e4508-63ae-4c34-9e5a-88f0e8d37185","Type":"ContainerDied","Data":"1673083ebe5809a9d4854b6d18310852e47f34e2640de4e90520e7c50bb8cb26"} Apr 19 15:27:08.092644 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:08.092621 2579 scope.go:117] "RemoveContainer" containerID="1673083ebe5809a9d4854b6d18310852e47f34e2640de4e90520e7c50bb8cb26" Apr 19 15:27:08.093920 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:08.093870 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b46876d5d-ftfw7" event={"ID":"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0","Type":"ContainerStarted","Data":"c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1"} Apr 19 15:27:08.150825 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:08.150687 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c4dddf6d5-nkgrt" podStartSLOduration=1.694047895 podStartE2EDuration="2.150670724s" podCreationTimestamp="2026-04-19 15:27:06 +0000 UTC" firstStartedPulling="2026-04-19 15:27:07.428685425 +0000 UTC m=+126.534600792" lastFinishedPulling="2026-04-19 15:27:07.885308241 +0000 UTC m=+126.991223621" observedRunningTime="2026-04-19 15:27:08.119046085 +0000 UTC m=+127.224961487" watchObservedRunningTime="2026-04-19 15:27:08.150670724 +0000 UTC m=+127.256586113" Apr 19 15:27:08.165794 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:08.165748 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b46876d5d-ftfw7" podStartSLOduration=8.315789091 podStartE2EDuration="12.165701811s" podCreationTimestamp="2026-04-19 15:26:56 +0000 UTC" firstStartedPulling="2026-04-19 15:27:03.828379663 +0000 UTC m=+122.934295029" lastFinishedPulling="2026-04-19 15:27:07.67829238 +0000 UTC m=+126.784207749" observedRunningTime="2026-04-19 15:27:08.165574435 +0000 UTC m=+127.271489824" watchObservedRunningTime="2026-04-19 15:27:08.165701811 +0000 UTC m=+127.271617200" Apr 19 15:27:08.889921 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:08.889888 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2xfjp_b10af7e0-ddd6-409a-bf97-0223a35bb81a/serve-healthcheck-canary/0.log" Apr 19 15:27:09.099475 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:09.099433 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dbkrb" event={"ID":"0d8e4508-63ae-4c34-9e5a-88f0e8d37185","Type":"ContainerStarted","Data":"3e75db52678745a65c38da7a9a1bf8ea933dce4bbcf63f9c1757ea7fa561afff"} Apr 19 15:27:10.281637 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:10.281592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:27:10.284345 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:10.284311 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb40b9-2854-47c5-8759-3fbea6b42b53-metrics-certs\") pod \"network-metrics-daemon-8cprr\" (UID: \"41bb40b9-2854-47c5-8759-3fbea6b42b53\") " pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:27:10.560833 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:10.560738 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zjn96\"" Apr 19 15:27:10.569600 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:10.569568 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cprr" Apr 19 15:27:10.717549 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:10.717515 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8cprr"] Apr 19 15:27:10.720567 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:27:10.720517 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41bb40b9_2854_47c5_8759_3fbea6b42b53.slice/crio-8a3f436c45a1f9aa20d8e1cc6d2bd22851a577b524609b848ca279017dcf3389 WatchSource:0}: Error finding container 8a3f436c45a1f9aa20d8e1cc6d2bd22851a577b524609b848ca279017dcf3389: Status 404 returned error can't find the container with id 8a3f436c45a1f9aa20d8e1cc6d2bd22851a577b524609b848ca279017dcf3389 Apr 19 15:27:11.107645 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:11.107601 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8cprr" event={"ID":"41bb40b9-2854-47c5-8759-3fbea6b42b53","Type":"ContainerStarted","Data":"8a3f436c45a1f9aa20d8e1cc6d2bd22851a577b524609b848ca279017dcf3389"} Apr 19 15:27:13.116465 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:13.116426 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8cprr" event={"ID":"41bb40b9-2854-47c5-8759-3fbea6b42b53","Type":"ContainerStarted","Data":"cd67e0cf88bbadc11dd1a521533f54af9b21a2f7749dc908c74ada620167224b"} Apr 19 15:27:13.116465 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:13.116468 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8cprr" event={"ID":"41bb40b9-2854-47c5-8759-3fbea6b42b53","Type":"ContainerStarted","Data":"03e4a89fa352a42333cfe2ab8203c492f61cc274b0feba2957f394d91e01a42c"} Apr 19 15:27:13.130423 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:13.130360 2579 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-8cprr" podStartSLOduration=130.442422745 podStartE2EDuration="2m12.13033933s" podCreationTimestamp="2026-04-19 15:25:01 +0000 UTC" firstStartedPulling="2026-04-19 15:27:10.722790026 +0000 UTC m=+129.828705392" lastFinishedPulling="2026-04-19 15:27:12.41070659 +0000 UTC m=+131.516621977" observedRunningTime="2026-04-19 15:27:13.129505728 +0000 UTC m=+132.235421140" watchObservedRunningTime="2026-04-19 15:27:13.13033933 +0000 UTC m=+132.236254718" Apr 19 15:27:16.126014 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:16.125976 2579 generic.go:358] "Generic (PLEG): container finished" podID="e18d4ee3-accb-4d8b-aad0-8801d1395e00" containerID="b669f8ea054c7bcb741a99a4c86f9a48033c35449ea459763e2ead0079b9d9a1" exitCode=0 Apr 19 15:27:16.126511 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:16.126060 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh" event={"ID":"e18d4ee3-accb-4d8b-aad0-8801d1395e00","Type":"ContainerDied","Data":"b669f8ea054c7bcb741a99a4c86f9a48033c35449ea459763e2ead0079b9d9a1"} Apr 19 15:27:16.126511 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:16.126492 2579 scope.go:117] "RemoveContainer" containerID="b669f8ea054c7bcb741a99a4c86f9a48033c35449ea459763e2ead0079b9d9a1" Apr 19 15:27:16.563338 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:16.563303 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:16.563609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:16.563348 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:16.568171 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:16.568148 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 
15:27:17.130367 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:17.130330 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k9hxh" event={"ID":"e18d4ee3-accb-4d8b-aad0-8801d1395e00","Type":"ContainerStarted","Data":"68a98d29210721cae3071ece6bded48fe68ffbd108df7a7c4d943751818c5d0d"} Apr 19 15:27:17.134447 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:17.134425 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:27:17.187314 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:17.187280 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b46876d5d-ftfw7"] Apr 19 15:27:17.234368 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:17.234326 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:27:18.887655 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:18.887624 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:27:18.891821 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:18.891790 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6bbcbb9c4d-x7js9" Apr 19 15:27:42.207025 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.206964 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5b46876d5d-ftfw7" podUID="e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" containerName="console" containerID="cri-o://c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1" gracePeriod=15 Apr 19 15:27:42.517656 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.517632 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-5b46876d5d-ftfw7_e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0/console/0.log" Apr 19 15:27:42.517817 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.517694 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:27:42.565273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.565238 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-serving-cert\") pod \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " Apr 19 15:27:42.565466 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.565294 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-service-ca\") pod \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " Apr 19 15:27:42.565466 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.565328 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-oauth-config\") pod \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " Apr 19 15:27:42.565466 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.565425 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-oauth-serving-cert\") pod \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " Apr 19 15:27:42.565634 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.565486 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-5prhw\" (UniqueName: \"kubernetes.io/projected/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-kube-api-access-5prhw\") pod \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " Apr 19 15:27:42.565634 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.565521 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-config\") pod \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\" (UID: \"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0\") " Apr 19 15:27:42.565769 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.565706 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" (UID: "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:27:42.565829 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.565761 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-service-ca" (OuterVolumeSpecName: "service-ca") pod "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" (UID: "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:27:42.565960 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.565942 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-oauth-serving-cert\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:27:42.566030 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.565964 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-service-ca\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:27:42.566030 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.565966 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-config" (OuterVolumeSpecName: "console-config") pod "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" (UID: "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:27:42.567811 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.567779 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" (UID: "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 15:27:42.567811 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.567780 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-kube-api-access-5prhw" (OuterVolumeSpecName: "kube-api-access-5prhw") pod "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" (UID: "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0"). 
InnerVolumeSpecName "kube-api-access-5prhw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:27:42.567925 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.567853 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" (UID: "e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 15:27:42.666477 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.666432 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-oauth-config\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:27:42.666477 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.666469 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5prhw\" (UniqueName: \"kubernetes.io/projected/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-kube-api-access-5prhw\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:27:42.666477 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.666479 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-config\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:27:42.666477 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:42.666488 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0-console-serving-cert\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:27:43.209860 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:43.209829 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-5b46876d5d-ftfw7_e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0/console/0.log" Apr 19 15:27:43.210268 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:43.209872 2579 generic.go:358] "Generic (PLEG): container finished" podID="e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" containerID="c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1" exitCode=2 Apr 19 15:27:43.210268 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:43.209907 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b46876d5d-ftfw7" event={"ID":"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0","Type":"ContainerDied","Data":"c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1"} Apr 19 15:27:43.210268 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:43.209962 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b46876d5d-ftfw7" Apr 19 15:27:43.210268 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:43.209965 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b46876d5d-ftfw7" event={"ID":"e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0","Type":"ContainerDied","Data":"ec062491b19de5847f7e9652ce458d2a0f9c527713a8c7fa5b060b0deb11c74e"} Apr 19 15:27:43.210268 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:43.209988 2579 scope.go:117] "RemoveContainer" containerID="c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1" Apr 19 15:27:43.218188 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:43.218170 2579 scope.go:117] "RemoveContainer" containerID="c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1" Apr 19 15:27:43.218433 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:27:43.218415 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1\": container with ID starting with 
c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1 not found: ID does not exist" containerID="c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1" Apr 19 15:27:43.218489 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:43.218446 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1"} err="failed to get container status \"c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1\": rpc error: code = NotFound desc = could not find container \"c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1\": container with ID starting with c60d9bcb6e417124a6bd2713a6df324853e1a9967fc78e5cef4fd9f4460ac5a1 not found: ID does not exist" Apr 19 15:27:43.228671 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:43.228642 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b46876d5d-ftfw7"] Apr 19 15:27:43.230423 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:43.230404 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b46876d5d-ftfw7"] Apr 19 15:27:43.545332 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:27:43.545256 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" path="/var/lib/kubelet/pods/e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0/volumes" Apr 19 15:28:05.223510 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.223469 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-579f7fb596-jpftg"] Apr 19 15:28:05.223958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.223838 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" containerName="console" Apr 19 15:28:05.223958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.223852 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" containerName="console" Apr 19 15:28:05.223958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.223915 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2ebd73f-76c2-4762-8ce4-e0e3f46eedc0" containerName="console" Apr 19 15:28:05.228486 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.228470 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.238463 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.238438 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-579f7fb596-jpftg"] Apr 19 15:28:05.357833 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.357782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-serving-cert\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.357833 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.357829 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-service-ca\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.358052 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.357953 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h6kl\" (UniqueName: \"kubernetes.io/projected/af1accab-19c1-4b1c-a8e7-290d0f5252a4-kube-api-access-2h6kl\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.358052 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.358017 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-trusted-ca-bundle\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.358130 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.358052 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-config\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.358130 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.358072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-oauth-serving-cert\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.358130 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.358091 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-oauth-config\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.459378 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.459338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2h6kl\" (UniqueName: \"kubernetes.io/projected/af1accab-19c1-4b1c-a8e7-290d0f5252a4-kube-api-access-2h6kl\") 
pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.459537 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.459420 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-trusted-ca-bundle\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.459537 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.459453 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-config\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.459537 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.459479 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-oauth-serving-cert\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.459537 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.459506 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-oauth-config\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.459822 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.459538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-serving-cert\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.459822 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.459561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-service-ca\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.460242 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.460214 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-oauth-serving-cert\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.460393 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.460369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-config\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.460433 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.460417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-trusted-ca-bundle\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.460532 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.460510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-service-ca\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.462188 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.462156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-serving-cert\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.462276 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.462213 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-oauth-config\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.466933 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.466913 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h6kl\" (UniqueName: \"kubernetes.io/projected/af1accab-19c1-4b1c-a8e7-290d0f5252a4-kube-api-access-2h6kl\") pod \"console-579f7fb596-jpftg\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.538206 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.538097 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:05.663953 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:05.663921 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-579f7fb596-jpftg"] Apr 19 15:28:05.667655 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:28:05.667616 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf1accab_19c1_4b1c_a8e7_290d0f5252a4.slice/crio-8395a04ea9f5b529a029a514b9a284198b0c1c95dfaaf9cdf8490a809b0db53f WatchSource:0}: Error finding container 8395a04ea9f5b529a029a514b9a284198b0c1c95dfaaf9cdf8490a809b0db53f: Status 404 returned error can't find the container with id 8395a04ea9f5b529a029a514b9a284198b0c1c95dfaaf9cdf8490a809b0db53f Apr 19 15:28:06.276213 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:06.276179 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579f7fb596-jpftg" event={"ID":"af1accab-19c1-4b1c-a8e7-290d0f5252a4","Type":"ContainerStarted","Data":"1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1"} Apr 19 15:28:06.276213 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:06.276216 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579f7fb596-jpftg" event={"ID":"af1accab-19c1-4b1c-a8e7-290d0f5252a4","Type":"ContainerStarted","Data":"8395a04ea9f5b529a029a514b9a284198b0c1c95dfaaf9cdf8490a809b0db53f"} Apr 19 15:28:06.293204 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:06.293155 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-579f7fb596-jpftg" podStartSLOduration=1.2931413680000001 podStartE2EDuration="1.293141368s" podCreationTimestamp="2026-04-19 15:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:28:06.291303255 +0000 UTC 
m=+185.397218665" watchObservedRunningTime="2026-04-19 15:28:06.293141368 +0000 UTC m=+185.399056756" Apr 19 15:28:15.539230 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:15.539186 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:15.539755 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:15.539266 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:15.544834 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:15.544812 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:16.308039 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:16.308002 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:28:16.350068 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:16.350031 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c4dddf6d5-nkgrt"] Apr 19 15:28:38.412681 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.412646 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-spsbv"] Apr 19 15:28:38.415873 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.415855 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-spsbv" Apr 19 15:28:38.418293 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.418275 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 19 15:28:38.436889 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.436854 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-spsbv"] Apr 19 15:28:38.543608 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.543564 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/55899c70-f0e0-481b-94b0-9aad2305242f-kubelet-config\") pod \"global-pull-secret-syncer-spsbv\" (UID: \"55899c70-f0e0-481b-94b0-9aad2305242f\") " pod="kube-system/global-pull-secret-syncer-spsbv" Apr 19 15:28:38.543832 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.543619 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55899c70-f0e0-481b-94b0-9aad2305242f-original-pull-secret\") pod \"global-pull-secret-syncer-spsbv\" (UID: \"55899c70-f0e0-481b-94b0-9aad2305242f\") " pod="kube-system/global-pull-secret-syncer-spsbv" Apr 19 15:28:38.543832 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.543756 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/55899c70-f0e0-481b-94b0-9aad2305242f-dbus\") pod \"global-pull-secret-syncer-spsbv\" (UID: \"55899c70-f0e0-481b-94b0-9aad2305242f\") " pod="kube-system/global-pull-secret-syncer-spsbv" Apr 19 15:28:38.644848 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.644814 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/55899c70-f0e0-481b-94b0-9aad2305242f-kubelet-config\") pod \"global-pull-secret-syncer-spsbv\" (UID: \"55899c70-f0e0-481b-94b0-9aad2305242f\") " pod="kube-system/global-pull-secret-syncer-spsbv" Apr 19 15:28:38.644988 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.644864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55899c70-f0e0-481b-94b0-9aad2305242f-original-pull-secret\") pod \"global-pull-secret-syncer-spsbv\" (UID: \"55899c70-f0e0-481b-94b0-9aad2305242f\") " pod="kube-system/global-pull-secret-syncer-spsbv" Apr 19 15:28:38.644988 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.644905 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/55899c70-f0e0-481b-94b0-9aad2305242f-dbus\") pod \"global-pull-secret-syncer-spsbv\" (UID: \"55899c70-f0e0-481b-94b0-9aad2305242f\") " pod="kube-system/global-pull-secret-syncer-spsbv" Apr 19 15:28:38.644988 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.644962 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/55899c70-f0e0-481b-94b0-9aad2305242f-kubelet-config\") pod \"global-pull-secret-syncer-spsbv\" (UID: \"55899c70-f0e0-481b-94b0-9aad2305242f\") " pod="kube-system/global-pull-secret-syncer-spsbv" Apr 19 15:28:38.645084 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.645057 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/55899c70-f0e0-481b-94b0-9aad2305242f-dbus\") pod \"global-pull-secret-syncer-spsbv\" (UID: \"55899c70-f0e0-481b-94b0-9aad2305242f\") " pod="kube-system/global-pull-secret-syncer-spsbv" Apr 19 15:28:38.647384 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.647356 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55899c70-f0e0-481b-94b0-9aad2305242f-original-pull-secret\") pod \"global-pull-secret-syncer-spsbv\" (UID: \"55899c70-f0e0-481b-94b0-9aad2305242f\") " pod="kube-system/global-pull-secret-syncer-spsbv" Apr 19 15:28:38.724603 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.724562 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-spsbv" Apr 19 15:28:38.848463 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:38.848299 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-spsbv"] Apr 19 15:28:38.851526 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:28:38.851484 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55899c70_f0e0_481b_94b0_9aad2305242f.slice/crio-e092b431572b3d67e46acb83ccd15039d683f7eff6a1fb8c13af6cac7759f0b3 WatchSource:0}: Error finding container e092b431572b3d67e46acb83ccd15039d683f7eff6a1fb8c13af6cac7759f0b3: Status 404 returned error can't find the container with id e092b431572b3d67e46acb83ccd15039d683f7eff6a1fb8c13af6cac7759f0b3 Apr 19 15:28:39.373174 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:39.373140 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-spsbv" event={"ID":"55899c70-f0e0-481b-94b0-9aad2305242f","Type":"ContainerStarted","Data":"e092b431572b3d67e46acb83ccd15039d683f7eff6a1fb8c13af6cac7759f0b3"} Apr 19 15:28:41.370252 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:41.370164 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c4dddf6d5-nkgrt" podUID="d2fc52e5-27a4-4f86-9d56-db3e1aec4727" containerName="console" containerID="cri-o://49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c" gracePeriod=15 Apr 19 15:28:42.627307 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:28:42.627279 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c4dddf6d5-nkgrt_d2fc52e5-27a4-4f86-9d56-db3e1aec4727/console/0.log" Apr 19 15:28:42.627635 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.627350 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:28:42.681033 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.680996 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-serving-cert\") pod \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " Apr 19 15:28:42.681197 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.681070 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-service-ca\") pod \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " Apr 19 15:28:42.681197 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.681172 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-oauth-serving-cert\") pod \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " Apr 19 15:28:42.681284 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.681219 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-trusted-ca-bundle\") pod \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " Apr 19 15:28:42.681284 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.681257 2579 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-oauth-config\") pod \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " Apr 19 15:28:42.681365 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.681323 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-config\") pod \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " Apr 19 15:28:42.681417 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.681378 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkcqd\" (UniqueName: \"kubernetes.io/projected/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-kube-api-access-tkcqd\") pod \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\" (UID: \"d2fc52e5-27a4-4f86-9d56-db3e1aec4727\") " Apr 19 15:28:42.681466 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.681412 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-service-ca" (OuterVolumeSpecName: "service-ca") pod "d2fc52e5-27a4-4f86-9d56-db3e1aec4727" (UID: "d2fc52e5-27a4-4f86-9d56-db3e1aec4727"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:28:42.681667 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.681528 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d2fc52e5-27a4-4f86-9d56-db3e1aec4727" (UID: "d2fc52e5-27a4-4f86-9d56-db3e1aec4727"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:28:42.681667 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.681614 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d2fc52e5-27a4-4f86-9d56-db3e1aec4727" (UID: "d2fc52e5-27a4-4f86-9d56-db3e1aec4727"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:28:42.681892 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.681678 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-config" (OuterVolumeSpecName: "console-config") pod "d2fc52e5-27a4-4f86-9d56-db3e1aec4727" (UID: "d2fc52e5-27a4-4f86-9d56-db3e1aec4727"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:28:42.682010 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.681986 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-service-ca\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:28:42.682010 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.682011 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-oauth-serving-cert\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:28:42.682182 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.682022 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-trusted-ca-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:28:42.682182 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:28:42.682032 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-config\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:28:42.683521 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.683495 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d2fc52e5-27a4-4f86-9d56-db3e1aec4727" (UID: "d2fc52e5-27a4-4f86-9d56-db3e1aec4727"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 15:28:42.683646 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.683629 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d2fc52e5-27a4-4f86-9d56-db3e1aec4727" (UID: "d2fc52e5-27a4-4f86-9d56-db3e1aec4727"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 15:28:42.684117 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.684100 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-kube-api-access-tkcqd" (OuterVolumeSpecName: "kube-api-access-tkcqd") pod "d2fc52e5-27a4-4f86-9d56-db3e1aec4727" (UID: "d2fc52e5-27a4-4f86-9d56-db3e1aec4727"). InnerVolumeSpecName "kube-api-access-tkcqd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:28:42.783000 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.782957 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkcqd\" (UniqueName: \"kubernetes.io/projected/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-kube-api-access-tkcqd\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:28:42.783000 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.783003 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-serving-cert\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:28:42.783196 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:42.783019 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2fc52e5-27a4-4f86-9d56-db3e1aec4727-console-oauth-config\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:28:43.386422 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.386381 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-spsbv" event={"ID":"55899c70-f0e0-481b-94b0-9aad2305242f","Type":"ContainerStarted","Data":"b19389c45916fd7a57e4c9d1646008f9636efddccc6a15c51d35366afe569b35"} Apr 19 15:28:43.387588 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.387568 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c4dddf6d5-nkgrt_d2fc52e5-27a4-4f86-9d56-db3e1aec4727/console/0.log" Apr 19 15:28:43.387701 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.387608 2579 generic.go:358] "Generic (PLEG): container finished" podID="d2fc52e5-27a4-4f86-9d56-db3e1aec4727" containerID="49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c" exitCode=2 Apr 19 15:28:43.387701 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.387644 2579 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/console-c4dddf6d5-nkgrt" event={"ID":"d2fc52e5-27a4-4f86-9d56-db3e1aec4727","Type":"ContainerDied","Data":"49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c"} Apr 19 15:28:43.387701 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.387664 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c4dddf6d5-nkgrt" Apr 19 15:28:43.387701 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.387691 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c4dddf6d5-nkgrt" event={"ID":"d2fc52e5-27a4-4f86-9d56-db3e1aec4727","Type":"ContainerDied","Data":"384bee5c62e5d7005a06429ad4c3c4fe5a9abc6b7a66da720a968fb718e93b88"} Apr 19 15:28:43.387866 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.387713 2579 scope.go:117] "RemoveContainer" containerID="49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c" Apr 19 15:28:43.396382 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.396221 2579 scope.go:117] "RemoveContainer" containerID="49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c" Apr 19 15:28:43.396519 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:28:43.396501 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c\": container with ID starting with 49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c not found: ID does not exist" containerID="49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c" Apr 19 15:28:43.396577 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.396529 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c"} err="failed to get container status \"49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c\": rpc 
error: code = NotFound desc = could not find container \"49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c\": container with ID starting with 49108cdc6d557736e3392e5ffc7188bcc3b72a5bafb44f86195ca1a25eddc99c not found: ID does not exist" Apr 19 15:28:43.400298 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.400261 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-spsbv" podStartSLOduration=1.811112303 podStartE2EDuration="5.400249834s" podCreationTimestamp="2026-04-19 15:28:38 +0000 UTC" firstStartedPulling="2026-04-19 15:28:38.853351519 +0000 UTC m=+217.959266884" lastFinishedPulling="2026-04-19 15:28:42.442489047 +0000 UTC m=+221.548404415" observedRunningTime="2026-04-19 15:28:43.399323955 +0000 UTC m=+222.505239345" watchObservedRunningTime="2026-04-19 15:28:43.400249834 +0000 UTC m=+222.506165218" Apr 19 15:28:43.412035 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.412012 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c4dddf6d5-nkgrt"] Apr 19 15:28:43.415375 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.415351 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c4dddf6d5-nkgrt"] Apr 19 15:28:43.545547 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:43.545505 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2fc52e5-27a4-4f86-9d56-db3e1aec4727" path="/var/lib/kubelet/pods/d2fc52e5-27a4-4f86-9d56-db3e1aec4727/volumes" Apr 19 15:28:52.084535 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.084497 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts"] Apr 19 15:28:52.085029 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.084864 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2fc52e5-27a4-4f86-9d56-db3e1aec4727" containerName="console" Apr 
19 15:28:52.085029 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.084879 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fc52e5-27a4-4f86-9d56-db3e1aec4727" containerName="console" Apr 19 15:28:52.085029 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.084942 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2fc52e5-27a4-4f86-9d56-db3e1aec4727" containerName="console" Apr 19 15:28:52.088002 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.087977 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:28:52.090218 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.090195 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 15:28:52.090218 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.090266 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-s8b9d\"" Apr 19 15:28:52.090914 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.090884 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 15:28:52.096697 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.096664 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts"] Apr 19 15:28:52.168130 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.168095 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:28:52.168298 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.168145 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:28:52.168298 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.168199 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8286\" (UniqueName: \"kubernetes.io/projected/a54d8642-73fb-4a98-83d5-87e7eb8341e8-kube-api-access-g8286\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:28:52.268949 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.268913 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:28:52.269092 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.268973 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8286\" (UniqueName: \"kubernetes.io/projected/a54d8642-73fb-4a98-83d5-87e7eb8341e8-kube-api-access-g8286\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:28:52.269092 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.269041 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:28:52.269396 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.269369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:28:52.269433 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.269378 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:28:52.277226 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.277190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8286\" (UniqueName: \"kubernetes.io/projected/a54d8642-73fb-4a98-83d5-87e7eb8341e8-kube-api-access-g8286\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 
15:28:52.398268 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.398181 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:28:52.519092 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:52.519067 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts"] Apr 19 15:28:52.521430 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:28:52.521403 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda54d8642_73fb_4a98_83d5_87e7eb8341e8.slice/crio-06b406b854fdbdcac3361e66c68fcd67bf9e91340ac8573711db880d8a3654d3 WatchSource:0}: Error finding container 06b406b854fdbdcac3361e66c68fcd67bf9e91340ac8573711db880d8a3654d3: Status 404 returned error can't find the container with id 06b406b854fdbdcac3361e66c68fcd67bf9e91340ac8573711db880d8a3654d3 Apr 19 15:28:53.423758 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:53.423701 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" event={"ID":"a54d8642-73fb-4a98-83d5-87e7eb8341e8","Type":"ContainerStarted","Data":"06b406b854fdbdcac3361e66c68fcd67bf9e91340ac8573711db880d8a3654d3"} Apr 19 15:28:59.443586 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:59.443552 2579 generic.go:358] "Generic (PLEG): container finished" podID="a54d8642-73fb-4a98-83d5-87e7eb8341e8" containerID="5494805648a1ff75045acfab034d976b9722ae7b16bd2291178c0bb401e64783" exitCode=0 Apr 19 15:28:59.444000 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:28:59.443651 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" 
event={"ID":"a54d8642-73fb-4a98-83d5-87e7eb8341e8","Type":"ContainerDied","Data":"5494805648a1ff75045acfab034d976b9722ae7b16bd2291178c0bb401e64783"} Apr 19 15:29:02.455943 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:02.455904 2579 generic.go:358] "Generic (PLEG): container finished" podID="a54d8642-73fb-4a98-83d5-87e7eb8341e8" containerID="fc8a31b486347f0db351cb614a65a0b7a24984583dcb81fb9908b06a06d3c5e5" exitCode=0 Apr 19 15:29:02.456350 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:02.455968 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" event={"ID":"a54d8642-73fb-4a98-83d5-87e7eb8341e8","Type":"ContainerDied","Data":"fc8a31b486347f0db351cb614a65a0b7a24984583dcb81fb9908b06a06d3c5e5"} Apr 19 15:29:10.487379 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:10.487341 2579 generic.go:358] "Generic (PLEG): container finished" podID="a54d8642-73fb-4a98-83d5-87e7eb8341e8" containerID="ba21692670264af99daee824c0657018bcccf6cc01d24ef22b0498d099168159" exitCode=0 Apr 19 15:29:10.487761 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:10.487400 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" event={"ID":"a54d8642-73fb-4a98-83d5-87e7eb8341e8","Type":"ContainerDied","Data":"ba21692670264af99daee824c0657018bcccf6cc01d24ef22b0498d099168159"} Apr 19 15:29:11.620404 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:11.620380 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:29:11.635670 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:11.634880 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-bundle\") pod \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " Apr 19 15:29:11.635953 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:11.635925 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8286\" (UniqueName: \"kubernetes.io/projected/a54d8642-73fb-4a98-83d5-87e7eb8341e8-kube-api-access-g8286\") pod \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " Apr 19 15:29:11.636117 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:11.636015 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-util\") pod \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\" (UID: \"a54d8642-73fb-4a98-83d5-87e7eb8341e8\") " Apr 19 15:29:11.637288 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:11.637248 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-bundle" (OuterVolumeSpecName: "bundle") pod "a54d8642-73fb-4a98-83d5-87e7eb8341e8" (UID: "a54d8642-73fb-4a98-83d5-87e7eb8341e8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:29:11.640820 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:11.640796 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54d8642-73fb-4a98-83d5-87e7eb8341e8-kube-api-access-g8286" (OuterVolumeSpecName: "kube-api-access-g8286") pod "a54d8642-73fb-4a98-83d5-87e7eb8341e8" (UID: "a54d8642-73fb-4a98-83d5-87e7eb8341e8"). InnerVolumeSpecName "kube-api-access-g8286". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:29:11.641907 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:11.641873 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-util" (OuterVolumeSpecName: "util") pod "a54d8642-73fb-4a98-83d5-87e7eb8341e8" (UID: "a54d8642-73fb-4a98-83d5-87e7eb8341e8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:29:11.737384 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:11.737343 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:29:11.737384 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:11.737380 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g8286\" (UniqueName: \"kubernetes.io/projected/a54d8642-73fb-4a98-83d5-87e7eb8341e8-kube-api-access-g8286\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:29:11.737384 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:11.737391 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a54d8642-73fb-4a98-83d5-87e7eb8341e8-util\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:29:12.494493 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:12.494461 2579 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" Apr 19 15:29:12.494493 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:12.494469 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58rqts" event={"ID":"a54d8642-73fb-4a98-83d5-87e7eb8341e8","Type":"ContainerDied","Data":"06b406b854fdbdcac3361e66c68fcd67bf9e91340ac8573711db880d8a3654d3"} Apr 19 15:29:12.494696 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:12.494503 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06b406b854fdbdcac3361e66c68fcd67bf9e91340ac8573711db880d8a3654d3" Apr 19 15:29:19.368665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.368579 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4"] Apr 19 15:29:19.369200 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.368904 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a54d8642-73fb-4a98-83d5-87e7eb8341e8" containerName="pull" Apr 19 15:29:19.369200 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.368916 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54d8642-73fb-4a98-83d5-87e7eb8341e8" containerName="pull" Apr 19 15:29:19.369200 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.368929 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a54d8642-73fb-4a98-83d5-87e7eb8341e8" containerName="extract" Apr 19 15:29:19.369200 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.368934 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54d8642-73fb-4a98-83d5-87e7eb8341e8" containerName="extract" Apr 19 15:29:19.369200 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.368941 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a54d8642-73fb-4a98-83d5-87e7eb8341e8" containerName="util" Apr 19 15:29:19.369200 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.368947 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54d8642-73fb-4a98-83d5-87e7eb8341e8" containerName="util" Apr 19 15:29:19.369200 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.368999 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a54d8642-73fb-4a98-83d5-87e7eb8341e8" containerName="extract" Apr 19 15:29:19.371988 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.371970 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4" Apr 19 15:29:19.374171 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.374136 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 19 15:29:19.374296 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.374193 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 19 15:29:19.374353 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.374312 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-jjb5f\"" Apr 19 15:29:19.382507 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.382483 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4"] Apr 19 15:29:19.404526 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.404502 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53630047-61ff-4edb-b041-85f458c4356f-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8v6l4\" (UID: 
\"53630047-61ff-4edb-b041-85f458c4356f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4" Apr 19 15:29:19.404653 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.404581 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6mh\" (UniqueName: \"kubernetes.io/projected/53630047-61ff-4edb-b041-85f458c4356f-kube-api-access-ct6mh\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8v6l4\" (UID: \"53630047-61ff-4edb-b041-85f458c4356f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4" Apr 19 15:29:19.505643 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.505602 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ct6mh\" (UniqueName: \"kubernetes.io/projected/53630047-61ff-4edb-b041-85f458c4356f-kube-api-access-ct6mh\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8v6l4\" (UID: \"53630047-61ff-4edb-b041-85f458c4356f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4" Apr 19 15:29:19.505907 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.505650 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53630047-61ff-4edb-b041-85f458c4356f-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8v6l4\" (UID: \"53630047-61ff-4edb-b041-85f458c4356f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4" Apr 19 15:29:19.506079 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.506059 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53630047-61ff-4edb-b041-85f458c4356f-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8v6l4\" (UID: \"53630047-61ff-4edb-b041-85f458c4356f\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4" Apr 19 15:29:19.514057 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.514030 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6mh\" (UniqueName: \"kubernetes.io/projected/53630047-61ff-4edb-b041-85f458c4356f-kube-api-access-ct6mh\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8v6l4\" (UID: \"53630047-61ff-4edb-b041-85f458c4356f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4" Apr 19 15:29:19.681568 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.681536 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4" Apr 19 15:29:19.810399 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:19.810374 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4"] Apr 19 15:29:19.812865 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:29:19.812834 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53630047_61ff_4edb_b041_85f458c4356f.slice/crio-44b219d000a8d44f80659509c371331584424acb9430a04f5a6909bfdbe24641 WatchSource:0}: Error finding container 44b219d000a8d44f80659509c371331584424acb9430a04f5a6909bfdbe24641: Status 404 returned error can't find the container with id 44b219d000a8d44f80659509c371331584424acb9430a04f5a6909bfdbe24641 Apr 19 15:29:20.519767 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:20.519705 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4" event={"ID":"53630047-61ff-4edb-b041-85f458c4356f","Type":"ContainerStarted","Data":"44b219d000a8d44f80659509c371331584424acb9430a04f5a6909bfdbe24641"} Apr 19 15:29:22.528400 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:22.528359 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4" event={"ID":"53630047-61ff-4edb-b041-85f458c4356f","Type":"ContainerStarted","Data":"b15d66d4cf4c2724d01cd7b87f7501a1afd9bdc7eeffafa5debae839d1b3b860"} Apr 19 15:29:22.549696 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:22.549639 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8v6l4" podStartSLOduration=1.501737377 podStartE2EDuration="3.54962151s" podCreationTimestamp="2026-04-19 15:29:19 +0000 UTC" firstStartedPulling="2026-04-19 15:29:19.815334931 +0000 UTC m=+258.921250298" lastFinishedPulling="2026-04-19 15:29:21.863219065 +0000 UTC m=+260.969134431" observedRunningTime="2026-04-19 15:29:22.547893366 +0000 UTC m=+261.653808877" watchObservedRunningTime="2026-04-19 15:29:22.54962151 +0000 UTC m=+261.655536898" Apr 19 15:29:23.740714 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.740678 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp"] Apr 19 15:29:23.743870 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.743848 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:23.745926 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.745897 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 15:29:23.745926 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.745908 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 15:29:23.746753 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.746733 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-s8b9d\"" Apr 19 15:29:23.751883 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.751861 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp"] Apr 19 15:29:23.843111 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.843070 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:23.843111 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.843114 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh97k\" (UniqueName: \"kubernetes.io/projected/c8c7e4d0-331e-49b0-8f27-3404705604f5-kube-api-access-nh97k\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:23.843326 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.843147 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:23.943702 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.943661 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:23.943702 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.943707 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nh97k\" (UniqueName: \"kubernetes.io/projected/c8c7e4d0-331e-49b0-8f27-3404705604f5-kube-api-access-nh97k\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:23.943959 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.943771 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:23.944072 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.944049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:23.944145 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.944080 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:23.952274 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:23.952255 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh97k\" (UniqueName: \"kubernetes.io/projected/c8c7e4d0-331e-49b0-8f27-3404705604f5-kube-api-access-nh97k\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:24.053610 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:24.053522 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:24.175778 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:24.175685 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp"] Apr 19 15:29:24.178052 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:29:24.178022 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8c7e4d0_331e_49b0_8f27_3404705604f5.slice/crio-3f747252dda28ff029947e9dcc8cf7bc30fc33f9a69ff9c99eb0f9eae7e4847e WatchSource:0}: Error finding container 3f747252dda28ff029947e9dcc8cf7bc30fc33f9a69ff9c99eb0f9eae7e4847e: Status 404 returned error can't find the container with id 3f747252dda28ff029947e9dcc8cf7bc30fc33f9a69ff9c99eb0f9eae7e4847e Apr 19 15:29:24.537109 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:24.537069 2579 generic.go:358] "Generic (PLEG): container finished" podID="c8c7e4d0-331e-49b0-8f27-3404705604f5" containerID="c4c809defeb0571cbee508ecf0ff915cfcfa5c419f2aad7a5c6fc8baf7ca58fb" exitCode=0 Apr 19 15:29:24.537312 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:24.537163 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" event={"ID":"c8c7e4d0-331e-49b0-8f27-3404705604f5","Type":"ContainerDied","Data":"c4c809defeb0571cbee508ecf0ff915cfcfa5c419f2aad7a5c6fc8baf7ca58fb"} Apr 19 15:29:24.537312 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:24.537206 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" event={"ID":"c8c7e4d0-331e-49b0-8f27-3404705604f5","Type":"ContainerStarted","Data":"3f747252dda28ff029947e9dcc8cf7bc30fc33f9a69ff9c99eb0f9eae7e4847e"} Apr 19 15:29:25.069250 ip-10-0-133-218 kubenswrapper[2579]: 
I0419 15:29:25.069214 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-hvjjv"] Apr 19 15:29:25.072218 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.072198 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" Apr 19 15:29:25.075061 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.075041 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-zhl6z\"" Apr 19 15:29:25.075236 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.075220 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 19 15:29:25.075776 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.075763 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 19 15:29:25.086535 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.086510 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-hvjjv"] Apr 19 15:29:25.154324 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.154285 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9978\" (UniqueName: \"kubernetes.io/projected/2c024fe8-633b-420d-8037-2ba6f1caa0dc-kube-api-access-x9978\") pod \"cert-manager-webhook-597b96b99b-hvjjv\" (UID: \"2c024fe8-633b-420d-8037-2ba6f1caa0dc\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" Apr 19 15:29:25.154324 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.154327 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c024fe8-633b-420d-8037-2ba6f1caa0dc-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-hvjjv\" (UID: 
\"2c024fe8-633b-420d-8037-2ba6f1caa0dc\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" Apr 19 15:29:25.255026 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.254933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c024fe8-633b-420d-8037-2ba6f1caa0dc-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-hvjjv\" (UID: \"2c024fe8-633b-420d-8037-2ba6f1caa0dc\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" Apr 19 15:29:25.255026 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.255018 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9978\" (UniqueName: \"kubernetes.io/projected/2c024fe8-633b-420d-8037-2ba6f1caa0dc-kube-api-access-x9978\") pod \"cert-manager-webhook-597b96b99b-hvjjv\" (UID: \"2c024fe8-633b-420d-8037-2ba6f1caa0dc\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" Apr 19 15:29:25.262628 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.262597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c024fe8-633b-420d-8037-2ba6f1caa0dc-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-hvjjv\" (UID: \"2c024fe8-633b-420d-8037-2ba6f1caa0dc\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" Apr 19 15:29:25.263042 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.263023 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9978\" (UniqueName: \"kubernetes.io/projected/2c024fe8-633b-420d-8037-2ba6f1caa0dc-kube-api-access-x9978\") pod \"cert-manager-webhook-597b96b99b-hvjjv\" (UID: \"2c024fe8-633b-420d-8037-2ba6f1caa0dc\") " pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" Apr 19 15:29:25.391487 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.391448 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" Apr 19 15:29:25.519270 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.516512 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-hvjjv"] Apr 19 15:29:25.547444 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:25.547407 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" event={"ID":"2c024fe8-633b-420d-8037-2ba6f1caa0dc","Type":"ContainerStarted","Data":"b3afb0d18911c027b4c1dbdbfa51a6b6d83c691dbc7697b2a0fe9163243d8e57"} Apr 19 15:29:27.555252 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:27.555153 2579 generic.go:358] "Generic (PLEG): container finished" podID="c8c7e4d0-331e-49b0-8f27-3404705604f5" containerID="bd02ab2b65746d98521a89b8e7e50646e712c89dbb8af8ba907495543bd97315" exitCode=0 Apr 19 15:29:27.555252 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:27.555211 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" event={"ID":"c8c7e4d0-331e-49b0-8f27-3404705604f5","Type":"ContainerDied","Data":"bd02ab2b65746d98521a89b8e7e50646e712c89dbb8af8ba907495543bd97315"} Apr 19 15:29:28.560145 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:28.560107 2579 generic.go:358] "Generic (PLEG): container finished" podID="c8c7e4d0-331e-49b0-8f27-3404705604f5" containerID="fe1c853a670e1af3c7d8f0adbf4c360bf761f5ab9c5b08b3de67ae37560a8108" exitCode=0 Apr 19 15:29:28.560536 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:28.560199 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" event={"ID":"c8c7e4d0-331e-49b0-8f27-3404705604f5","Type":"ContainerDied","Data":"fe1c853a670e1af3c7d8f0adbf4c360bf761f5ab9c5b08b3de67ae37560a8108"} Apr 19 15:29:30.210748 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:29:30.210693 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:30.301660 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.301634 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh97k\" (UniqueName: \"kubernetes.io/projected/c8c7e4d0-331e-49b0-8f27-3404705604f5-kube-api-access-nh97k\") pod \"c8c7e4d0-331e-49b0-8f27-3404705604f5\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " Apr 19 15:29:30.301813 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.301672 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-util\") pod \"c8c7e4d0-331e-49b0-8f27-3404705604f5\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " Apr 19 15:29:30.301813 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.301750 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-bundle\") pod \"c8c7e4d0-331e-49b0-8f27-3404705604f5\" (UID: \"c8c7e4d0-331e-49b0-8f27-3404705604f5\") " Apr 19 15:29:30.302233 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.302192 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-bundle" (OuterVolumeSpecName: "bundle") pod "c8c7e4d0-331e-49b0-8f27-3404705604f5" (UID: "c8c7e4d0-331e-49b0-8f27-3404705604f5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:29:30.304016 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.303992 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c7e4d0-331e-49b0-8f27-3404705604f5-kube-api-access-nh97k" (OuterVolumeSpecName: "kube-api-access-nh97k") pod "c8c7e4d0-331e-49b0-8f27-3404705604f5" (UID: "c8c7e4d0-331e-49b0-8f27-3404705604f5"). InnerVolumeSpecName "kube-api-access-nh97k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:29:30.307027 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.307001 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-util" (OuterVolumeSpecName: "util") pod "c8c7e4d0-331e-49b0-8f27-3404705604f5" (UID: "c8c7e4d0-331e-49b0-8f27-3404705604f5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:29:30.407809 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.403705 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nh97k\" (UniqueName: \"kubernetes.io/projected/c8c7e4d0-331e-49b0-8f27-3404705604f5-kube-api-access-nh97k\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:29:30.407809 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.403784 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-util\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:29:30.407809 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.403839 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c7e4d0-331e-49b0-8f27-3404705604f5-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:29:30.568512 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.568408 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" event={"ID":"2c024fe8-633b-420d-8037-2ba6f1caa0dc","Type":"ContainerStarted","Data":"a90257b73010d262605f79e1764edd6fa9b03c053b0fae8faf3ce632a52f8c1e"} Apr 19 15:29:30.568684 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.568526 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" Apr 19 15:29:30.569947 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.569914 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" event={"ID":"c8c7e4d0-331e-49b0-8f27-3404705604f5","Type":"ContainerDied","Data":"3f747252dda28ff029947e9dcc8cf7bc30fc33f9a69ff9c99eb0f9eae7e4847e"} Apr 19 15:29:30.569947 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.569949 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f747252dda28ff029947e9dcc8cf7bc30fc33f9a69ff9c99eb0f9eae7e4847e" Apr 19 15:29:30.570110 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.569999 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2fxp" Apr 19 15:29:30.583553 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:30.583506 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" podStartSLOduration=0.840288573 podStartE2EDuration="5.583494998s" podCreationTimestamp="2026-04-19 15:29:25 +0000 UTC" firstStartedPulling="2026-04-19 15:29:25.520973453 +0000 UTC m=+264.626888833" lastFinishedPulling="2026-04-19 15:29:30.264179892 +0000 UTC m=+269.370095258" observedRunningTime="2026-04-19 15:29:30.581569314 +0000 UTC m=+269.687484705" watchObservedRunningTime="2026-04-19 15:29:30.583494998 +0000 UTC m=+269.689410385" Apr 19 15:29:35.273982 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.273947 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d"] Apr 19 15:29:35.274339 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.274246 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8c7e4d0-331e-49b0-8f27-3404705604f5" containerName="extract" Apr 19 15:29:35.274339 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.274256 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c7e4d0-331e-49b0-8f27-3404705604f5" containerName="extract" Apr 19 15:29:35.274339 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.274276 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8c7e4d0-331e-49b0-8f27-3404705604f5" containerName="pull" Apr 19 15:29:35.274339 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.274281 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c7e4d0-331e-49b0-8f27-3404705604f5" containerName="pull" Apr 19 15:29:35.274339 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.274293 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c8c7e4d0-331e-49b0-8f27-3404705604f5" containerName="util" Apr 19 15:29:35.274339 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.274301 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c7e4d0-331e-49b0-8f27-3404705604f5" containerName="util" Apr 19 15:29:35.274515 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.274344 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8c7e4d0-331e-49b0-8f27-3404705604f5" containerName="extract" Apr 19 15:29:35.277385 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.277366 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d" Apr 19 15:29:35.279578 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.279557 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 19 15:29:35.280283 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.280256 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 19 15:29:35.280374 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.280262 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-4ndqj\"" Apr 19 15:29:35.284365 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.284342 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d"] Apr 19 15:29:35.346017 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.345977 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dvnv\" (UniqueName: \"kubernetes.io/projected/92319b45-1aa1-4418-9292-8eacaf99ec5f-kube-api-access-7dvnv\") pod \"openshift-lws-operator-bfc7f696d-8dr4d\" (UID: \"92319b45-1aa1-4418-9292-8eacaf99ec5f\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d" Apr 19 15:29:35.346163 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.346040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/92319b45-1aa1-4418-9292-8eacaf99ec5f-tmp\") pod \"openshift-lws-operator-bfc7f696d-8dr4d\" (UID: \"92319b45-1aa1-4418-9292-8eacaf99ec5f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d" Apr 19 15:29:35.447180 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.447130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dvnv\" (UniqueName: \"kubernetes.io/projected/92319b45-1aa1-4418-9292-8eacaf99ec5f-kube-api-access-7dvnv\") pod \"openshift-lws-operator-bfc7f696d-8dr4d\" (UID: \"92319b45-1aa1-4418-9292-8eacaf99ec5f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d" Apr 19 15:29:35.447344 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.447232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/92319b45-1aa1-4418-9292-8eacaf99ec5f-tmp\") pod \"openshift-lws-operator-bfc7f696d-8dr4d\" (UID: \"92319b45-1aa1-4418-9292-8eacaf99ec5f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d" Apr 19 15:29:35.447636 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.447616 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/92319b45-1aa1-4418-9292-8eacaf99ec5f-tmp\") pod \"openshift-lws-operator-bfc7f696d-8dr4d\" (UID: \"92319b45-1aa1-4418-9292-8eacaf99ec5f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d" Apr 19 15:29:35.454674 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.454639 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dvnv\" (UniqueName: 
\"kubernetes.io/projected/92319b45-1aa1-4418-9292-8eacaf99ec5f-kube-api-access-7dvnv\") pod \"openshift-lws-operator-bfc7f696d-8dr4d\" (UID: \"92319b45-1aa1-4418-9292-8eacaf99ec5f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d" Apr 19 15:29:35.587931 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.587850 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d" Apr 19 15:29:35.710916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:35.710848 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d"] Apr 19 15:29:35.713221 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:29:35.713193 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92319b45_1aa1_4418_9292_8eacaf99ec5f.slice/crio-7f6064e640a8ac9480d14b7b8d3cffe312ff27dfa44c12d5351f94909b087c01 WatchSource:0}: Error finding container 7f6064e640a8ac9480d14b7b8d3cffe312ff27dfa44c12d5351f94909b087c01: Status 404 returned error can't find the container with id 7f6064e640a8ac9480d14b7b8d3cffe312ff27dfa44c12d5351f94909b087c01 Apr 19 15:29:36.574406 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:36.574372 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-hvjjv" Apr 19 15:29:36.591595 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:36.591557 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d" event={"ID":"92319b45-1aa1-4418-9292-8eacaf99ec5f","Type":"ContainerStarted","Data":"7f6064e640a8ac9480d14b7b8d3cffe312ff27dfa44c12d5351f94909b087c01"} Apr 19 15:29:38.599460 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:38.599364 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d" event={"ID":"92319b45-1aa1-4418-9292-8eacaf99ec5f","Type":"ContainerStarted","Data":"2c9400ed7dafe4321a28f3a16c4a7fdf01ca4d94dc17b0374e33b0261f99713c"} Apr 19 15:29:38.618289 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:38.618235 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8dr4d" podStartSLOduration=1.147271678 podStartE2EDuration="3.618220265s" podCreationTimestamp="2026-04-19 15:29:35 +0000 UTC" firstStartedPulling="2026-04-19 15:29:35.71479667 +0000 UTC m=+274.820712042" lastFinishedPulling="2026-04-19 15:29:38.185745244 +0000 UTC m=+277.291660629" observedRunningTime="2026-04-19 15:29:38.61600107 +0000 UTC m=+277.721916449" watchObservedRunningTime="2026-04-19 15:29:38.618220265 +0000 UTC m=+277.724135653" Apr 19 15:29:40.860812 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:40.860777 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n"] Apr 19 15:29:40.864559 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:40.864542 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:40.866967 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:40.866946 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 15:29:40.867079 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:40.866990 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 15:29:40.867079 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:40.867027 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-s8b9d\"" Apr 19 15:29:40.872611 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:40.872587 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n"] Apr 19 15:29:40.991849 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:40.991811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:40.992024 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:40.991920 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clqp4\" (UniqueName: \"kubernetes.io/projected/c30fe241-a17d-4dfb-bd78-178ab1fea627-kube-api-access-clqp4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:40.992024 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:40.991948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:41.093020 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:41.092978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:41.093181 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:41.093062 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clqp4\" (UniqueName: \"kubernetes.io/projected/c30fe241-a17d-4dfb-bd78-178ab1fea627-kube-api-access-clqp4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:41.093181 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:41.093085 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:41.093378 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:41.093353 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:41.093437 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:41.093407 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:41.100440 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:41.100413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clqp4\" (UniqueName: \"kubernetes.io/projected/c30fe241-a17d-4dfb-bd78-178ab1fea627-kube-api-access-clqp4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:41.174466 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:41.174357 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:41.302943 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:41.302917 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n"] Apr 19 15:29:41.305248 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:29:41.305208 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30fe241_a17d_4dfb_bd78_178ab1fea627.slice/crio-3f4f76d20a6aa7c6f23d357fbaec79de26e04c951ae4097ff34efe8e84b7d2f6 WatchSource:0}: Error finding container 3f4f76d20a6aa7c6f23d357fbaec79de26e04c951ae4097ff34efe8e84b7d2f6: Status 404 returned error can't find the container with id 3f4f76d20a6aa7c6f23d357fbaec79de26e04c951ae4097ff34efe8e84b7d2f6 Apr 19 15:29:41.610063 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:41.610022 2579 generic.go:358] "Generic (PLEG): container finished" podID="c30fe241-a17d-4dfb-bd78-178ab1fea627" containerID="4047a972144c43cb70239af695e067fa058009f6788ab6103bb13f87b589f84f" exitCode=0 Apr 19 15:29:41.610292 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:41.610068 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" event={"ID":"c30fe241-a17d-4dfb-bd78-178ab1fea627","Type":"ContainerDied","Data":"4047a972144c43cb70239af695e067fa058009f6788ab6103bb13f87b589f84f"} Apr 19 15:29:41.610292 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:41.610095 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" event={"ID":"c30fe241-a17d-4dfb-bd78-178ab1fea627","Type":"ContainerStarted","Data":"3f4f76d20a6aa7c6f23d357fbaec79de26e04c951ae4097ff34efe8e84b7d2f6"} Apr 19 15:29:42.618708 ip-10-0-133-218 kubenswrapper[2579]: 
I0419 15:29:42.618668 2579 generic.go:358] "Generic (PLEG): container finished" podID="c30fe241-a17d-4dfb-bd78-178ab1fea627" containerID="db5f58e333d0a8b6dbfc02143d132c5a5f7a978859f812e1abe790d2f4fa0db1" exitCode=0 Apr 19 15:29:42.619099 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:42.618759 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" event={"ID":"c30fe241-a17d-4dfb-bd78-178ab1fea627","Type":"ContainerDied","Data":"db5f58e333d0a8b6dbfc02143d132c5a5f7a978859f812e1abe790d2f4fa0db1"} Apr 19 15:29:43.624431 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:43.624396 2579 generic.go:358] "Generic (PLEG): container finished" podID="c30fe241-a17d-4dfb-bd78-178ab1fea627" containerID="5c56c5186bf1785a66ab3f10adbadbb9008e71999cf223f5790944f7d2e5236f" exitCode=0 Apr 19 15:29:43.624817 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:43.624470 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" event={"ID":"c30fe241-a17d-4dfb-bd78-178ab1fea627","Type":"ContainerDied","Data":"5c56c5186bf1785a66ab3f10adbadbb9008e71999cf223f5790944f7d2e5236f"} Apr 19 15:29:43.794224 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:43.794182 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-mxrzg"] Apr 19 15:29:43.834943 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:43.834901 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-mxrzg"] Apr 19 15:29:43.835077 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:43.835037 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-mxrzg" Apr 19 15:29:43.837750 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:43.837712 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-n68jv\"" Apr 19 15:29:43.920209 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:43.920123 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r99p\" (UniqueName: \"kubernetes.io/projected/a1a5ead9-4433-4552-87df-9d7839e26744-kube-api-access-4r99p\") pod \"cert-manager-759f64656b-mxrzg\" (UID: \"a1a5ead9-4433-4552-87df-9d7839e26744\") " pod="cert-manager/cert-manager-759f64656b-mxrzg" Apr 19 15:29:43.920209 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:43.920173 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1a5ead9-4433-4552-87df-9d7839e26744-bound-sa-token\") pod \"cert-manager-759f64656b-mxrzg\" (UID: \"a1a5ead9-4433-4552-87df-9d7839e26744\") " pod="cert-manager/cert-manager-759f64656b-mxrzg" Apr 19 15:29:44.021091 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.021047 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4r99p\" (UniqueName: \"kubernetes.io/projected/a1a5ead9-4433-4552-87df-9d7839e26744-kube-api-access-4r99p\") pod \"cert-manager-759f64656b-mxrzg\" (UID: \"a1a5ead9-4433-4552-87df-9d7839e26744\") " pod="cert-manager/cert-manager-759f64656b-mxrzg" Apr 19 15:29:44.021245 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.021105 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1a5ead9-4433-4552-87df-9d7839e26744-bound-sa-token\") pod \"cert-manager-759f64656b-mxrzg\" (UID: \"a1a5ead9-4433-4552-87df-9d7839e26744\") " 
pod="cert-manager/cert-manager-759f64656b-mxrzg" Apr 19 15:29:44.028402 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.028368 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1a5ead9-4433-4552-87df-9d7839e26744-bound-sa-token\") pod \"cert-manager-759f64656b-mxrzg\" (UID: \"a1a5ead9-4433-4552-87df-9d7839e26744\") " pod="cert-manager/cert-manager-759f64656b-mxrzg" Apr 19 15:29:44.028402 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.028387 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r99p\" (UniqueName: \"kubernetes.io/projected/a1a5ead9-4433-4552-87df-9d7839e26744-kube-api-access-4r99p\") pod \"cert-manager-759f64656b-mxrzg\" (UID: \"a1a5ead9-4433-4552-87df-9d7839e26744\") " pod="cert-manager/cert-manager-759f64656b-mxrzg" Apr 19 15:29:44.144337 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.144296 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-mxrzg" Apr 19 15:29:44.265024 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.264872 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-mxrzg"] Apr 19 15:29:44.267543 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:29:44.267516 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1a5ead9_4433_4552_87df_9d7839e26744.slice/crio-f6adfcc1e50e540c917fc5782f506bc403ed1d9c0faf242c8134bc75b501d061 WatchSource:0}: Error finding container f6adfcc1e50e540c917fc5782f506bc403ed1d9c0faf242c8134bc75b501d061: Status 404 returned error can't find the container with id f6adfcc1e50e540c917fc5782f506bc403ed1d9c0faf242c8134bc75b501d061 Apr 19 15:29:44.629949 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.629834 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-mxrzg" event={"ID":"a1a5ead9-4433-4552-87df-9d7839e26744","Type":"ContainerStarted","Data":"2ca0f8c9d26866d5f0c14f4d5c8a617ef70a2661cb44a7ebcb1b09f19d15c33f"} Apr 19 15:29:44.629949 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.629882 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-mxrzg" event={"ID":"a1a5ead9-4433-4552-87df-9d7839e26744","Type":"ContainerStarted","Data":"f6adfcc1e50e540c917fc5782f506bc403ed1d9c0faf242c8134bc75b501d061"} Apr 19 15:29:44.644816 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.644767 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-mxrzg" podStartSLOduration=1.6447491379999999 podStartE2EDuration="1.644749138s" podCreationTimestamp="2026-04-19 15:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:29:44.643140815 +0000 UTC 
m=+283.749056203" watchObservedRunningTime="2026-04-19 15:29:44.644749138 +0000 UTC m=+283.750664527" Apr 19 15:29:44.760605 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.760581 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:44.929956 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.929921 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clqp4\" (UniqueName: \"kubernetes.io/projected/c30fe241-a17d-4dfb-bd78-178ab1fea627-kube-api-access-clqp4\") pod \"c30fe241-a17d-4dfb-bd78-178ab1fea627\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " Apr 19 15:29:44.930127 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.930022 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-util\") pod \"c30fe241-a17d-4dfb-bd78-178ab1fea627\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " Apr 19 15:29:44.930127 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.930065 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-bundle\") pod \"c30fe241-a17d-4dfb-bd78-178ab1fea627\" (UID: \"c30fe241-a17d-4dfb-bd78-178ab1fea627\") " Apr 19 15:29:44.930840 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.930812 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-bundle" (OuterVolumeSpecName: "bundle") pod "c30fe241-a17d-4dfb-bd78-178ab1fea627" (UID: "c30fe241-a17d-4dfb-bd78-178ab1fea627"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:29:44.932201 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.932171 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30fe241-a17d-4dfb-bd78-178ab1fea627-kube-api-access-clqp4" (OuterVolumeSpecName: "kube-api-access-clqp4") pod "c30fe241-a17d-4dfb-bd78-178ab1fea627" (UID: "c30fe241-a17d-4dfb-bd78-178ab1fea627"). InnerVolumeSpecName "kube-api-access-clqp4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:29:44.935235 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:44.935202 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-util" (OuterVolumeSpecName: "util") pod "c30fe241-a17d-4dfb-bd78-178ab1fea627" (UID: "c30fe241-a17d-4dfb-bd78-178ab1fea627"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:29:45.031467 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:45.031434 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clqp4\" (UniqueName: \"kubernetes.io/projected/c30fe241-a17d-4dfb-bd78-178ab1fea627-kube-api-access-clqp4\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:29:45.031467 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:45.031465 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-util\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:29:45.031644 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:45.031479 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c30fe241-a17d-4dfb-bd78-178ab1fea627-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:29:45.634855 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:45.634817 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" event={"ID":"c30fe241-a17d-4dfb-bd78-178ab1fea627","Type":"ContainerDied","Data":"3f4f76d20a6aa7c6f23d357fbaec79de26e04c951ae4097ff34efe8e84b7d2f6"} Apr 19 15:29:45.634855 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:45.634863 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f4f76d20a6aa7c6f23d357fbaec79de26e04c951ae4097ff34efe8e84b7d2f6" Apr 19 15:29:45.635288 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:45.634836 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5v8v6n" Apr 19 15:29:54.808329 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.808289 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92"] Apr 19 15:29:54.808807 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.808607 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c30fe241-a17d-4dfb-bd78-178ab1fea627" containerName="util" Apr 19 15:29:54.808807 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.808618 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30fe241-a17d-4dfb-bd78-178ab1fea627" containerName="util" Apr 19 15:29:54.808807 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.808631 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c30fe241-a17d-4dfb-bd78-178ab1fea627" containerName="pull" Apr 19 15:29:54.808807 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.808636 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30fe241-a17d-4dfb-bd78-178ab1fea627" containerName="pull" Apr 19 15:29:54.808807 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.808653 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c30fe241-a17d-4dfb-bd78-178ab1fea627" containerName="extract" Apr 19 15:29:54.808807 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.808660 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30fe241-a17d-4dfb-bd78-178ab1fea627" containerName="extract" Apr 19 15:29:54.808807 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.808705 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c30fe241-a17d-4dfb-bd78-178ab1fea627" containerName="extract" Apr 19 15:29:54.811801 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.811784 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:54.814684 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.814662 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 19 15:29:54.814822 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.814781 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pcf6s\"" Apr 19 15:29:54.815808 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.815794 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 19 15:29:54.816156 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.816144 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 19 15:29:54.819696 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.819678 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 19 15:29:54.823905 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.823885 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-dsj58\" (UniqueName: \"kubernetes.io/projected/9ff7b2f4-5dfd-491b-ab80-88ce950644c4-kube-api-access-dsj58\") pod \"opendatahub-operator-controller-manager-67944f454b-2kv92\" (UID: \"9ff7b2f4-5dfd-491b-ab80-88ce950644c4\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:54.823978 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.823918 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ff7b2f4-5dfd-491b-ab80-88ce950644c4-webhook-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-2kv92\" (UID: \"9ff7b2f4-5dfd-491b-ab80-88ce950644c4\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:54.824027 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.824011 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ff7b2f4-5dfd-491b-ab80-88ce950644c4-apiservice-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-2kv92\" (UID: \"9ff7b2f4-5dfd-491b-ab80-88ce950644c4\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:54.830251 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.830226 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92"] Apr 19 15:29:54.924687 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.924648 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ff7b2f4-5dfd-491b-ab80-88ce950644c4-apiservice-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-2kv92\" (UID: \"9ff7b2f4-5dfd-491b-ab80-88ce950644c4\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:54.924903 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.924749 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsj58\" (UniqueName: \"kubernetes.io/projected/9ff7b2f4-5dfd-491b-ab80-88ce950644c4-kube-api-access-dsj58\") pod \"opendatahub-operator-controller-manager-67944f454b-2kv92\" (UID: \"9ff7b2f4-5dfd-491b-ab80-88ce950644c4\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:54.924903 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.924785 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ff7b2f4-5dfd-491b-ab80-88ce950644c4-webhook-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-2kv92\" (UID: \"9ff7b2f4-5dfd-491b-ab80-88ce950644c4\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:54.927432 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.927399 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ff7b2f4-5dfd-491b-ab80-88ce950644c4-apiservice-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-2kv92\" (UID: \"9ff7b2f4-5dfd-491b-ab80-88ce950644c4\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:54.927432 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.927429 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ff7b2f4-5dfd-491b-ab80-88ce950644c4-webhook-cert\") pod \"opendatahub-operator-controller-manager-67944f454b-2kv92\" (UID: \"9ff7b2f4-5dfd-491b-ab80-88ce950644c4\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:54.932873 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:54.932854 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dsj58\" (UniqueName: \"kubernetes.io/projected/9ff7b2f4-5dfd-491b-ab80-88ce950644c4-kube-api-access-dsj58\") pod \"opendatahub-operator-controller-manager-67944f454b-2kv92\" (UID: \"9ff7b2f4-5dfd-491b-ab80-88ce950644c4\") " pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:55.122050 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.121925 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:55.260576 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.260553 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92"] Apr 19 15:29:55.263899 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:29:55.263866 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ff7b2f4_5dfd_491b_ab80_88ce950644c4.slice/crio-2673a1f202b6012b815a4b03b7655f1e2c1259be7f1e4495b20257701f64fdb3 WatchSource:0}: Error finding container 2673a1f202b6012b815a4b03b7655f1e2c1259be7f1e4495b20257701f64fdb3: Status 404 returned error can't find the container with id 2673a1f202b6012b815a4b03b7655f1e2c1259be7f1e4495b20257701f64fdb3 Apr 19 15:29:55.272072 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.272043 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x"] Apr 19 15:29:55.277435 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.277413 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:29:55.279893 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.279873 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 15:29:55.279893 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.279884 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-s8b9d\"" Apr 19 15:29:55.280055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.279866 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 15:29:55.283680 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.283652 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x"] Apr 19 15:29:55.328607 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.328571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:29:55.328820 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.328618 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:29:55.328820 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.328696 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjrc4\" (UniqueName: \"kubernetes.io/projected/3067ef1d-65d8-4e57-8f13-00074224d8fa-kube-api-access-wjrc4\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:29:55.429558 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.429516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjrc4\" (UniqueName: \"kubernetes.io/projected/3067ef1d-65d8-4e57-8f13-00074224d8fa-kube-api-access-wjrc4\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:29:55.429714 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.429611 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:29:55.429714 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.429647 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:29:55.430038 ip-10-0-133-218 
kubenswrapper[2579]: I0419 15:29:55.430016 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:29:55.430109 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.430049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:29:55.442989 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.442956 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjrc4\" (UniqueName: \"kubernetes.io/projected/3067ef1d-65d8-4e57-8f13-00074224d8fa-kube-api-access-wjrc4\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:29:55.589162 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.589126 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:29:55.669146 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.669106 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" event={"ID":"9ff7b2f4-5dfd-491b-ab80-88ce950644c4","Type":"ContainerStarted","Data":"2673a1f202b6012b815a4b03b7655f1e2c1259be7f1e4495b20257701f64fdb3"} Apr 19 15:29:55.715248 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:55.715220 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x"] Apr 19 15:29:55.717332 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:29:55.717302 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3067ef1d_65d8_4e57_8f13_00074224d8fa.slice/crio-afefadd77b8e6b46606d026c2feb5d11b590f6c28e006b317e9dc1ccbbdf7ab7 WatchSource:0}: Error finding container afefadd77b8e6b46606d026c2feb5d11b590f6c28e006b317e9dc1ccbbdf7ab7: Status 404 returned error can't find the container with id afefadd77b8e6b46606d026c2feb5d11b590f6c28e006b317e9dc1ccbbdf7ab7 Apr 19 15:29:56.675151 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:56.675098 2579 generic.go:358] "Generic (PLEG): container finished" podID="3067ef1d-65d8-4e57-8f13-00074224d8fa" containerID="5ff1f5fbb10b82808451109d8c85efe1673d0e08aee9f9a941524ca03f6dcc7a" exitCode=0 Apr 19 15:29:56.675617 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:56.675270 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" event={"ID":"3067ef1d-65d8-4e57-8f13-00074224d8fa","Type":"ContainerDied","Data":"5ff1f5fbb10b82808451109d8c85efe1673d0e08aee9f9a941524ca03f6dcc7a"} Apr 19 15:29:56.675617 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:29:56.675304 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" event={"ID":"3067ef1d-65d8-4e57-8f13-00074224d8fa","Type":"ContainerStarted","Data":"afefadd77b8e6b46606d026c2feb5d11b590f6c28e006b317e9dc1ccbbdf7ab7"} Apr 19 15:29:58.685097 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:58.685061 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" event={"ID":"9ff7b2f4-5dfd-491b-ab80-88ce950644c4","Type":"ContainerStarted","Data":"c3419938cd2e0adb003c878e3b4df20c84b5281c09d67853b8601ad93a64f629"} Apr 19 15:29:58.685511 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:58.685210 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:29:58.686809 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:58.686783 2579 generic.go:358] "Generic (PLEG): container finished" podID="3067ef1d-65d8-4e57-8f13-00074224d8fa" containerID="a8c6015f96bbb966c4016d3022c63118c9a855003163deee00c165c41343a4f2" exitCode=0 Apr 19 15:29:58.686931 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:58.686818 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" event={"ID":"3067ef1d-65d8-4e57-8f13-00074224d8fa","Type":"ContainerDied","Data":"a8c6015f96bbb966c4016d3022c63118c9a855003163deee00c165c41343a4f2"} Apr 19 15:29:58.705873 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:58.705806 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" podStartSLOduration=2.005973977 podStartE2EDuration="4.705785764s" podCreationTimestamp="2026-04-19 15:29:54 +0000 UTC" firstStartedPulling="2026-04-19 15:29:55.265785124 +0000 UTC m=+294.371700490" 
lastFinishedPulling="2026-04-19 15:29:57.965596906 +0000 UTC m=+297.071512277" observedRunningTime="2026-04-19 15:29:58.704316664 +0000 UTC m=+297.810232051" watchObservedRunningTime="2026-04-19 15:29:58.705785764 +0000 UTC m=+297.811701154" Apr 19 15:29:59.693080 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:59.693038 2579 generic.go:358] "Generic (PLEG): container finished" podID="3067ef1d-65d8-4e57-8f13-00074224d8fa" containerID="8410df9b4785d833652ed61a7f3cabdb09ec529db8671e8f85c269e95342c509" exitCode=0 Apr 19 15:29:59.693480 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:29:59.693127 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" event={"ID":"3067ef1d-65d8-4e57-8f13-00074224d8fa","Type":"ContainerDied","Data":"8410df9b4785d833652ed61a7f3cabdb09ec529db8671e8f85c269e95342c509"} Apr 19 15:30:00.824660 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:00.824632 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:30:00.868937 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:00.868901 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-bundle\") pod \"3067ef1d-65d8-4e57-8f13-00074224d8fa\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " Apr 19 15:30:00.869115 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:00.868952 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjrc4\" (UniqueName: \"kubernetes.io/projected/3067ef1d-65d8-4e57-8f13-00074224d8fa-kube-api-access-wjrc4\") pod \"3067ef1d-65d8-4e57-8f13-00074224d8fa\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " Apr 19 15:30:00.869115 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:00.868983 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-util\") pod \"3067ef1d-65d8-4e57-8f13-00074224d8fa\" (UID: \"3067ef1d-65d8-4e57-8f13-00074224d8fa\") " Apr 19 15:30:00.869612 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:00.869589 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-bundle" (OuterVolumeSpecName: "bundle") pod "3067ef1d-65d8-4e57-8f13-00074224d8fa" (UID: "3067ef1d-65d8-4e57-8f13-00074224d8fa"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:30:00.871227 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:00.871201 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3067ef1d-65d8-4e57-8f13-00074224d8fa-kube-api-access-wjrc4" (OuterVolumeSpecName: "kube-api-access-wjrc4") pod "3067ef1d-65d8-4e57-8f13-00074224d8fa" (UID: "3067ef1d-65d8-4e57-8f13-00074224d8fa"). InnerVolumeSpecName "kube-api-access-wjrc4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:30:00.874307 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:00.874287 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-util" (OuterVolumeSpecName: "util") pod "3067ef1d-65d8-4e57-8f13-00074224d8fa" (UID: "3067ef1d-65d8-4e57-8f13-00074224d8fa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:30:00.969946 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:00.969849 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wjrc4\" (UniqueName: \"kubernetes.io/projected/3067ef1d-65d8-4e57-8f13-00074224d8fa-kube-api-access-wjrc4\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:30:00.969946 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:00.969881 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-util\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:30:00.969946 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:00.969892 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3067ef1d-65d8-4e57-8f13-00074224d8fa-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:30:01.421435 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:01.421345 2579 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:30:01.421596 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:01.421503 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:30:01.425713 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:01.425691 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 19 15:30:01.701175 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:01.701132 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" event={"ID":"3067ef1d-65d8-4e57-8f13-00074224d8fa","Type":"ContainerDied","Data":"afefadd77b8e6b46606d026c2feb5d11b590f6c28e006b317e9dc1ccbbdf7ab7"} Apr 19 15:30:01.701175 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:01.701180 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afefadd77b8e6b46606d026c2feb5d11b590f6c28e006b317e9dc1ccbbdf7ab7" Apr 19 15:30:01.701414 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:01.701184 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9dkw2x" Apr 19 15:30:09.695823 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:09.695794 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-67944f454b-2kv92" Apr 19 15:30:12.308126 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.308089 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh"] Apr 19 15:30:12.308525 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.308420 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3067ef1d-65d8-4e57-8f13-00074224d8fa" containerName="pull" Apr 19 15:30:12.308525 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.308431 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3067ef1d-65d8-4e57-8f13-00074224d8fa" containerName="pull" Apr 19 15:30:12.308525 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.308446 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3067ef1d-65d8-4e57-8f13-00074224d8fa" containerName="util" Apr 19 15:30:12.308525 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.308452 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3067ef1d-65d8-4e57-8f13-00074224d8fa" containerName="util" Apr 19 15:30:12.308525 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.308457 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3067ef1d-65d8-4e57-8f13-00074224d8fa" containerName="extract" Apr 19 15:30:12.308525 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.308463 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3067ef1d-65d8-4e57-8f13-00074224d8fa" containerName="extract" Apr 19 15:30:12.308525 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.308514 2579 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="3067ef1d-65d8-4e57-8f13-00074224d8fa" containerName="extract" Apr 19 15:30:12.312862 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.312843 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.323907 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.323884 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 19 15:30:12.324015 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.323883 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 19 15:30:12.324015 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.323933 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-8s5vj\"" Apr 19 15:30:12.324015 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.323890 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 19 15:30:12.343140 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.343105 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh"] Apr 19 15:30:12.366230 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.366197 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/667a5519-f1d2-40e9-b3bc-46d649ba3525-manager-config\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: \"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.366375 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.366257 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/667a5519-f1d2-40e9-b3bc-46d649ba3525-cert\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: \"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.366375 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.366293 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmzwq\" (UniqueName: \"kubernetes.io/projected/667a5519-f1d2-40e9-b3bc-46d649ba3525-kube-api-access-vmzwq\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: \"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.366375 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.366345 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/667a5519-f1d2-40e9-b3bc-46d649ba3525-metrics-cert\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: \"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.467683 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.467632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/667a5519-f1d2-40e9-b3bc-46d649ba3525-cert\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: \"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.467876 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.467696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmzwq\" (UniqueName: \"kubernetes.io/projected/667a5519-f1d2-40e9-b3bc-46d649ba3525-kube-api-access-vmzwq\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: 
\"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.467876 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.467784 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/667a5519-f1d2-40e9-b3bc-46d649ba3525-metrics-cert\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: \"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.467876 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.467818 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/667a5519-f1d2-40e9-b3bc-46d649ba3525-manager-config\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: \"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.468413 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.468382 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/667a5519-f1d2-40e9-b3bc-46d649ba3525-manager-config\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: \"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.470440 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.470421 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/667a5519-f1d2-40e9-b3bc-46d649ba3525-metrics-cert\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: \"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.470500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.470443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/667a5519-f1d2-40e9-b3bc-46d649ba3525-cert\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: \"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.478357 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.478336 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmzwq\" (UniqueName: \"kubernetes.io/projected/667a5519-f1d2-40e9-b3bc-46d649ba3525-kube-api-access-vmzwq\") pod \"lws-controller-manager-65b77f8fcd-m49lh\" (UID: \"667a5519-f1d2-40e9-b3bc-46d649ba3525\") " pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.622125 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.622029 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:12.760163 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.760139 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh"] Apr 19 15:30:12.762509 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:30:12.762481 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod667a5519_f1d2_40e9_b3bc_46d649ba3525.slice/crio-f1be6e2dda03fa937b667c17678ae72b7ff4bbd43229a052e71a7462cd969181 WatchSource:0}: Error finding container f1be6e2dda03fa937b667c17678ae72b7ff4bbd43229a052e71a7462cd969181: Status 404 returned error can't find the container with id f1be6e2dda03fa937b667c17678ae72b7ff4bbd43229a052e71a7462cd969181 Apr 19 15:30:12.764387 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:12.764370 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 15:30:13.747053 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:13.747012 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" event={"ID":"667a5519-f1d2-40e9-b3bc-46d649ba3525","Type":"ContainerStarted","Data":"f1be6e2dda03fa937b667c17678ae72b7ff4bbd43229a052e71a7462cd969181"} Apr 19 15:30:15.757924 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:15.757880 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" event={"ID":"667a5519-f1d2-40e9-b3bc-46d649ba3525","Type":"ContainerStarted","Data":"87fcfaea092aa6abf5b2eb5b69d7aa8cac107c87d3c0a3808566b1911cc1cf29"} Apr 19 15:30:15.758318 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:15.758014 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:15.774916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:15.774857 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" podStartSLOduration=1.847861526 podStartE2EDuration="3.774837697s" podCreationTimestamp="2026-04-19 15:30:12 +0000 UTC" firstStartedPulling="2026-04-19 15:30:12.76450139 +0000 UTC m=+311.870416759" lastFinishedPulling="2026-04-19 15:30:14.69147756 +0000 UTC m=+313.797392930" observedRunningTime="2026-04-19 15:30:15.772607855 +0000 UTC m=+314.878523247" watchObservedRunningTime="2026-04-19 15:30:15.774837697 +0000 UTC m=+314.880753086" Apr 19 15:30:18.039995 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.039958 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2"] Apr 19 15:30:18.043339 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.043321 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:18.045537 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.045506 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 15:30:18.045672 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.045538 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-s8b9d\"" Apr 19 15:30:18.046256 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.046240 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 15:30:18.050293 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.050266 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2"] Apr 19 15:30:18.120144 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.120112 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:18.120340 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.120162 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:18.120340 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.120259 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwqtr\" (UniqueName: \"kubernetes.io/projected/320b1121-0f45-4093-a07c-86fc0999e213-kube-api-access-rwqtr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:18.221705 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.221656 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:18.221920 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.221779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwqtr\" (UniqueName: \"kubernetes.io/projected/320b1121-0f45-4093-a07c-86fc0999e213-kube-api-access-rwqtr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:18.221920 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.221848 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:18.222144 ip-10-0-133-218 
kubenswrapper[2579]: I0419 15:30:18.222118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:18.222284 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.222219 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:18.229304 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.229275 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwqtr\" (UniqueName: \"kubernetes.io/projected/320b1121-0f45-4093-a07c-86fc0999e213-kube-api-access-rwqtr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:18.354547 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.354444 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:18.486332 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.486288 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2"] Apr 19 15:30:18.491000 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:30:18.490967 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod320b1121_0f45_4093_a07c_86fc0999e213.slice/crio-02c3cc5de74322c2d41c6da6324250e297e4df9552ed77c68073b7d21c7e06ae WatchSource:0}: Error finding container 02c3cc5de74322c2d41c6da6324250e297e4df9552ed77c68073b7d21c7e06ae: Status 404 returned error can't find the container with id 02c3cc5de74322c2d41c6da6324250e297e4df9552ed77c68073b7d21c7e06ae Apr 19 15:30:18.769355 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.769323 2579 generic.go:358] "Generic (PLEG): container finished" podID="320b1121-0f45-4093-a07c-86fc0999e213" containerID="1ff1e433b0535c678f59a479f4eba9246c40b1d93335da96fbc8c46b5bc03673" exitCode=0 Apr 19 15:30:18.769501 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.769467 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" event={"ID":"320b1121-0f45-4093-a07c-86fc0999e213","Type":"ContainerDied","Data":"1ff1e433b0535c678f59a479f4eba9246c40b1d93335da96fbc8c46b5bc03673"} Apr 19 15:30:18.769693 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:18.769621 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" event={"ID":"320b1121-0f45-4093-a07c-86fc0999e213","Type":"ContainerStarted","Data":"02c3cc5de74322c2d41c6da6324250e297e4df9552ed77c68073b7d21c7e06ae"} Apr 19 15:30:20.778217 ip-10-0-133-218 kubenswrapper[2579]: 
I0419 15:30:20.778184 2579 generic.go:358] "Generic (PLEG): container finished" podID="320b1121-0f45-4093-a07c-86fc0999e213" containerID="4bf152ea37e6a9c5e371856feaf79b0c2c2342f2a2376f19dab1af6c4da2c302" exitCode=0 Apr 19 15:30:20.778653 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:20.778280 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" event={"ID":"320b1121-0f45-4093-a07c-86fc0999e213","Type":"ContainerDied","Data":"4bf152ea37e6a9c5e371856feaf79b0c2c2342f2a2376f19dab1af6c4da2c302"} Apr 19 15:30:21.783761 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:21.783707 2579 generic.go:358] "Generic (PLEG): container finished" podID="320b1121-0f45-4093-a07c-86fc0999e213" containerID="8521d84abde12b06089be24d2c1a3243b64968fc7f70d9468b1b4737c2fd1349" exitCode=0 Apr 19 15:30:21.784138 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:21.783770 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" event={"ID":"320b1121-0f45-4093-a07c-86fc0999e213","Type":"ContainerDied","Data":"8521d84abde12b06089be24d2c1a3243b64968fc7f70d9468b1b4737c2fd1349"} Apr 19 15:30:22.911803 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:22.911778 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:22.963114 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:22.963083 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-bundle\") pod \"320b1121-0f45-4093-a07c-86fc0999e213\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " Apr 19 15:30:22.963305 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:22.963129 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwqtr\" (UniqueName: \"kubernetes.io/projected/320b1121-0f45-4093-a07c-86fc0999e213-kube-api-access-rwqtr\") pod \"320b1121-0f45-4093-a07c-86fc0999e213\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " Apr 19 15:30:22.963305 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:22.963266 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-util\") pod \"320b1121-0f45-4093-a07c-86fc0999e213\" (UID: \"320b1121-0f45-4093-a07c-86fc0999e213\") " Apr 19 15:30:22.964135 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:22.964101 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-bundle" (OuterVolumeSpecName: "bundle") pod "320b1121-0f45-4093-a07c-86fc0999e213" (UID: "320b1121-0f45-4093-a07c-86fc0999e213"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:30:22.965499 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:22.965473 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320b1121-0f45-4093-a07c-86fc0999e213-kube-api-access-rwqtr" (OuterVolumeSpecName: "kube-api-access-rwqtr") pod "320b1121-0f45-4093-a07c-86fc0999e213" (UID: "320b1121-0f45-4093-a07c-86fc0999e213"). InnerVolumeSpecName "kube-api-access-rwqtr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:30:22.971325 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:22.971299 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-util" (OuterVolumeSpecName: "util") pod "320b1121-0f45-4093-a07c-86fc0999e213" (UID: "320b1121-0f45-4093-a07c-86fc0999e213"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:30:23.064443 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:23.064362 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-util\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:30:23.064443 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:23.064394 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/320b1121-0f45-4093-a07c-86fc0999e213-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:30:23.064443 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:23.064404 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rwqtr\" (UniqueName: \"kubernetes.io/projected/320b1121-0f45-4093-a07c-86fc0999e213-kube-api-access-rwqtr\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:30:23.793290 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:23.793258 2579 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" Apr 19 15:30:23.793290 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:23.793279 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358g5d2" event={"ID":"320b1121-0f45-4093-a07c-86fc0999e213","Type":"ContainerDied","Data":"02c3cc5de74322c2d41c6da6324250e297e4df9552ed77c68073b7d21c7e06ae"} Apr 19 15:30:23.793492 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:23.793319 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02c3cc5de74322c2d41c6da6324250e297e4df9552ed77c68073b7d21c7e06ae" Apr 19 15:30:26.764545 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:26.764511 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-65b77f8fcd-m49lh" Apr 19 15:30:32.129907 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.129865 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g"] Apr 19 15:30:32.130396 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.130378 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="320b1121-0f45-4093-a07c-86fc0999e213" containerName="extract" Apr 19 15:30:32.130450 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.130401 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="320b1121-0f45-4093-a07c-86fc0999e213" containerName="extract" Apr 19 15:30:32.130450 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.130433 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="320b1121-0f45-4093-a07c-86fc0999e213" containerName="util" Apr 19 15:30:32.130450 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.130443 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="320b1121-0f45-4093-a07c-86fc0999e213" containerName="util" Apr 19 15:30:32.130561 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.130459 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="320b1121-0f45-4093-a07c-86fc0999e213" containerName="pull" Apr 19 15:30:32.130561 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.130468 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="320b1121-0f45-4093-a07c-86fc0999e213" containerName="pull" Apr 19 15:30:32.130561 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.130556 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="320b1121-0f45-4093-a07c-86fc0999e213" containerName="extract" Apr 19 15:30:32.140882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.140853 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:32.143939 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.143917 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-s8b9d\"" Apr 19 15:30:32.144421 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.144406 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 15:30:32.144958 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.144935 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 15:30:32.156056 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.156032 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g"] Apr 19 15:30:32.250458 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.250420 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:32.250631 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.250486 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:32.250631 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.250534 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dprg7\" (UniqueName: \"kubernetes.io/projected/90e1ab2c-772c-4262-8714-3160b840e702-kube-api-access-dprg7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:32.351219 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.351176 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:32.351396 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.351234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:32.351396 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.351263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dprg7\" (UniqueName: \"kubernetes.io/projected/90e1ab2c-772c-4262-8714-3160b840e702-kube-api-access-dprg7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:32.351612 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.351590 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:32.351651 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.351607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:32.361675 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.361652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dprg7\" (UniqueName: 
\"kubernetes.io/projected/90e1ab2c-772c-4262-8714-3160b840e702-kube-api-access-dprg7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:32.451675 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.451641 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:32.584307 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.584282 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g"] Apr 19 15:30:32.587030 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:30:32.586996 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e1ab2c_772c_4262_8714_3160b840e702.slice/crio-32c169fc8b8efd45945f8a30e273aeb00c2cd91c31c7ea5fc5266f796bfbe1bf WatchSource:0}: Error finding container 32c169fc8b8efd45945f8a30e273aeb00c2cd91c31c7ea5fc5266f796bfbe1bf: Status 404 returned error can't find the container with id 32c169fc8b8efd45945f8a30e273aeb00c2cd91c31c7ea5fc5266f796bfbe1bf Apr 19 15:30:32.826019 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.825917 2579 generic.go:358] "Generic (PLEG): container finished" podID="90e1ab2c-772c-4262-8714-3160b840e702" containerID="5cd50f9fc7d37d0a3a3176b97f80fbfb61bbc17a65a91600a825dae18fcd969b" exitCode=0 Apr 19 15:30:32.826019 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.826007 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" event={"ID":"90e1ab2c-772c-4262-8714-3160b840e702","Type":"ContainerDied","Data":"5cd50f9fc7d37d0a3a3176b97f80fbfb61bbc17a65a91600a825dae18fcd969b"} Apr 19 
15:30:32.826188 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:32.826042 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" event={"ID":"90e1ab2c-772c-4262-8714-3160b840e702","Type":"ContainerStarted","Data":"32c169fc8b8efd45945f8a30e273aeb00c2cd91c31c7ea5fc5266f796bfbe1bf"} Apr 19 15:30:33.831426 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:33.831383 2579 generic.go:358] "Generic (PLEG): container finished" podID="90e1ab2c-772c-4262-8714-3160b840e702" containerID="0bd51f345912a802587125e88498d42dd95c90ebf0d7a2eb5eef28e7f39fcbe3" exitCode=0 Apr 19 15:30:33.831836 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:33.831468 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" event={"ID":"90e1ab2c-772c-4262-8714-3160b840e702","Type":"ContainerDied","Data":"0bd51f345912a802587125e88498d42dd95c90ebf0d7a2eb5eef28e7f39fcbe3"} Apr 19 15:30:34.836748 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:34.836683 2579 generic.go:358] "Generic (PLEG): container finished" podID="90e1ab2c-772c-4262-8714-3160b840e702" containerID="4607fc36d437e0b37d21f2ecd87f455575c4d3795f8ead61cb6e354e55363b1d" exitCode=0 Apr 19 15:30:34.837186 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:34.836765 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" event={"ID":"90e1ab2c-772c-4262-8714-3160b840e702","Type":"ContainerDied","Data":"4607fc36d437e0b37d21f2ecd87f455575c4d3795f8ead61cb6e354e55363b1d"} Apr 19 15:30:35.977465 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:35.977439 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:36.085572 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.085537 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-util\") pod \"90e1ab2c-772c-4262-8714-3160b840e702\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " Apr 19 15:30:36.085784 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.085600 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-bundle\") pod \"90e1ab2c-772c-4262-8714-3160b840e702\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " Apr 19 15:30:36.085784 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.085674 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dprg7\" (UniqueName: \"kubernetes.io/projected/90e1ab2c-772c-4262-8714-3160b840e702-kube-api-access-dprg7\") pod \"90e1ab2c-772c-4262-8714-3160b840e702\" (UID: \"90e1ab2c-772c-4262-8714-3160b840e702\") " Apr 19 15:30:36.086519 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.086488 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-bundle" (OuterVolumeSpecName: "bundle") pod "90e1ab2c-772c-4262-8714-3160b840e702" (UID: "90e1ab2c-772c-4262-8714-3160b840e702"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:30:36.087941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.087915 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e1ab2c-772c-4262-8714-3160b840e702-kube-api-access-dprg7" (OuterVolumeSpecName: "kube-api-access-dprg7") pod "90e1ab2c-772c-4262-8714-3160b840e702" (UID: "90e1ab2c-772c-4262-8714-3160b840e702"). InnerVolumeSpecName "kube-api-access-dprg7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:30:36.091339 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.091262 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-util" (OuterVolumeSpecName: "util") pod "90e1ab2c-772c-4262-8714-3160b840e702" (UID: "90e1ab2c-772c-4262-8714-3160b840e702"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:30:36.186591 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.186531 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dprg7\" (UniqueName: \"kubernetes.io/projected/90e1ab2c-772c-4262-8714-3160b840e702-kube-api-access-dprg7\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:30:36.186591 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.186584 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-util\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:30:36.186591 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.186599 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90e1ab2c-772c-4262-8714-3160b840e702-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:30:36.846288 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.846260 2579 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" Apr 19 15:30:36.846435 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.846258 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fbj8g" event={"ID":"90e1ab2c-772c-4262-8714-3160b840e702","Type":"ContainerDied","Data":"32c169fc8b8efd45945f8a30e273aeb00c2cd91c31c7ea5fc5266f796bfbe1bf"} Apr 19 15:30:36.846435 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:36.846368 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32c169fc8b8efd45945f8a30e273aeb00c2cd91c31c7ea5fc5266f796bfbe1bf" Apr 19 15:30:52.858553 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.858474 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc"] Apr 19 15:30:52.859082 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.859062 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90e1ab2c-772c-4262-8714-3160b840e702" containerName="util" Apr 19 15:30:52.859145 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.859088 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e1ab2c-772c-4262-8714-3160b840e702" containerName="util" Apr 19 15:30:52.859145 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.859102 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90e1ab2c-772c-4262-8714-3160b840e702" containerName="pull" Apr 19 15:30:52.859145 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.859110 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e1ab2c-772c-4262-8714-3160b840e702" containerName="pull" Apr 19 15:30:52.859145 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.859128 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="90e1ab2c-772c-4262-8714-3160b840e702" containerName="extract" Apr 19 15:30:52.859145 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.859138 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e1ab2c-772c-4262-8714-3160b840e702" containerName="extract" Apr 19 15:30:52.859375 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.859205 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="90e1ab2c-772c-4262-8714-3160b840e702" containerName="extract" Apr 19 15:30:52.862397 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.862380 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:52.864629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.864604 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 19 15:30:52.864629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.864622 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-k9w2k\"" Apr 19 15:30:52.864898 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.864604 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 19 15:30:52.865429 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.865411 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 19 15:30:52.874307 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.874283 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc"] Apr 19 15:30:52.934461 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.934423 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:52.934637 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.934474 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:52.934637 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.934513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:52.934637 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.934532 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg6b8\" (UniqueName: \"kubernetes.io/projected/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-kube-api-access-gg6b8\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:52.934637 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.934554 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:52.934637 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.934608 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:52.934825 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.934666 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:52.934825 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:52.934685 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:52.934825 ip-10-0-133-218 kubenswrapper[2579]: 
I0419 15:30:52.934703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.035672 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.035635 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.035672 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.035679 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036007 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.035702 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036007 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.035775 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036007 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.035822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036007 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.035838 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036007 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.035872 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg6b8\" (UniqueName: \"kubernetes.io/projected/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-kube-api-access-gg6b8\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036007 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.035915 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036007 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.035943 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036321 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.036096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036321 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.036182 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036484 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.036457 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036612 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.036537 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.036758 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.036708 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.038436 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.038414 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.038502 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.038435 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-envoy\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.043104 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.043077 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.043198 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.043118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg6b8\" (UniqueName: \"kubernetes.io/projected/3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb-kube-api-access-gg6b8\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc\" (UID: \"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.176759 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.176631 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:53.305773 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.305746 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc"] Apr 19 15:30:53.307352 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:30:53.307321 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bfb65c0_effa_41d5_bdd1_2e41eb93cdbb.slice/crio-ef8d9dacb80a3b559cfb7588ff78e4aa21705d381ba7220ab02dbac4e4ca4b51 WatchSource:0}: Error finding container ef8d9dacb80a3b559cfb7588ff78e4aa21705d381ba7220ab02dbac4e4ca4b51: Status 404 returned error can't find the container with id ef8d9dacb80a3b559cfb7588ff78e4aa21705d381ba7220ab02dbac4e4ca4b51 Apr 19 15:30:53.910707 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:53.910668 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" event={"ID":"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb","Type":"ContainerStarted","Data":"ef8d9dacb80a3b559cfb7588ff78e4aa21705d381ba7220ab02dbac4e4ca4b51"} Apr 19 15:30:56.114483 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:56.114439 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 19 15:30:56.114738 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:56.114521 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 19 15:30:56.114738 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:56.114549 2579 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 19 15:30:56.928333 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:56.928296 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" event={"ID":"3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb","Type":"ContainerStarted","Data":"eb7474355ae8073b830350805094b3284ddd57ca48b91af5df38971234b1e66e"} Apr 19 15:30:56.948047 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:56.947985 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" podStartSLOduration=2.143055953 podStartE2EDuration="4.947969543s" podCreationTimestamp="2026-04-19 15:30:52 +0000 UTC" firstStartedPulling="2026-04-19 15:30:53.309252882 +0000 UTC m=+352.415168252" lastFinishedPulling="2026-04-19 15:30:56.114166473 +0000 UTC m=+355.220081842" observedRunningTime="2026-04-19 15:30:56.94590706 +0000 UTC m=+356.051822461" watchObservedRunningTime="2026-04-19 15:30:56.947969543 +0000 UTC m=+356.053884931" Apr 19 15:30:57.177807 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:57.177766 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:57.182636 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:57.182609 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:57.932035 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:30:57.932001 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:30:57.933246 ip-10-0-133-218 kubenswrapper[2579]: 
I0419 15:30:57.933226 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc" Apr 19 15:31:12.740201 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:12.740161 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-s4t8q"] Apr 19 15:31:12.745400 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:12.745376 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-s4t8q" Apr 19 15:31:12.747860 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:12.747675 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 19 15:31:12.747860 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:12.747776 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-lvmz2\"" Apr 19 15:31:12.748269 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:12.748175 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 19 15:31:12.750854 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:12.750830 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-s4t8q"] Apr 19 15:31:12.814739 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:12.814675 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfrpl\" (UniqueName: \"kubernetes.io/projected/f277f78d-2c61-44fc-a4ae-337ee6b1947e-kube-api-access-zfrpl\") pod \"kuadrant-operator-catalog-s4t8q\" (UID: \"f277f78d-2c61-44fc-a4ae-337ee6b1947e\") " pod="kuadrant-system/kuadrant-operator-catalog-s4t8q" Apr 19 15:31:12.915875 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:12.915832 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zfrpl\" (UniqueName: \"kubernetes.io/projected/f277f78d-2c61-44fc-a4ae-337ee6b1947e-kube-api-access-zfrpl\") pod \"kuadrant-operator-catalog-s4t8q\" (UID: \"f277f78d-2c61-44fc-a4ae-337ee6b1947e\") " pod="kuadrant-system/kuadrant-operator-catalog-s4t8q" Apr 19 15:31:12.923682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:12.923654 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfrpl\" (UniqueName: \"kubernetes.io/projected/f277f78d-2c61-44fc-a4ae-337ee6b1947e-kube-api-access-zfrpl\") pod \"kuadrant-operator-catalog-s4t8q\" (UID: \"f277f78d-2c61-44fc-a4ae-337ee6b1947e\") " pod="kuadrant-system/kuadrant-operator-catalog-s4t8q" Apr 19 15:31:13.056561 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.056472 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-s4t8q" Apr 19 15:31:13.113135 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.113103 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-s4t8q"] Apr 19 15:31:13.182702 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.182672 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-s4t8q"] Apr 19 15:31:13.184522 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:31:13.184491 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf277f78d_2c61_44fc_a4ae_337ee6b1947e.slice/crio-75a7935e7b2ba4504f1c1d6cf91f6f7edd7794de1e604a0b367d6fcae8014e8f WatchSource:0}: Error finding container 75a7935e7b2ba4504f1c1d6cf91f6f7edd7794de1e604a0b367d6fcae8014e8f: Status 404 returned error can't find the container with id 75a7935e7b2ba4504f1c1d6cf91f6f7edd7794de1e604a0b367d6fcae8014e8f Apr 19 15:31:13.320194 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.320111 2579 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2r9zr"] Apr 19 15:31:13.325211 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.325191 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-2r9zr" Apr 19 15:31:13.328774 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.328749 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2r9zr"] Apr 19 15:31:13.420124 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.420074 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvv9v\" (UniqueName: \"kubernetes.io/projected/4b6aecfb-f463-4b12-8f75-9252f1a3d5ee-kube-api-access-nvv9v\") pod \"kuadrant-operator-catalog-2r9zr\" (UID: \"4b6aecfb-f463-4b12-8f75-9252f1a3d5ee\") " pod="kuadrant-system/kuadrant-operator-catalog-2r9zr" Apr 19 15:31:13.520896 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.520861 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvv9v\" (UniqueName: \"kubernetes.io/projected/4b6aecfb-f463-4b12-8f75-9252f1a3d5ee-kube-api-access-nvv9v\") pod \"kuadrant-operator-catalog-2r9zr\" (UID: \"4b6aecfb-f463-4b12-8f75-9252f1a3d5ee\") " pod="kuadrant-system/kuadrant-operator-catalog-2r9zr" Apr 19 15:31:13.528173 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.528140 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvv9v\" (UniqueName: \"kubernetes.io/projected/4b6aecfb-f463-4b12-8f75-9252f1a3d5ee-kube-api-access-nvv9v\") pod \"kuadrant-operator-catalog-2r9zr\" (UID: \"4b6aecfb-f463-4b12-8f75-9252f1a3d5ee\") " pod="kuadrant-system/kuadrant-operator-catalog-2r9zr" Apr 19 15:31:13.636871 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.636766 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-2r9zr" Apr 19 15:31:13.779656 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.779634 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2r9zr"] Apr 19 15:31:13.781394 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:31:13.781364 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b6aecfb_f463_4b12_8f75_9252f1a3d5ee.slice/crio-2acb041dc20410ed736cc53aed4c8b53c9103aba29d3a678b90efd17d6ebd612 WatchSource:0}: Error finding container 2acb041dc20410ed736cc53aed4c8b53c9103aba29d3a678b90efd17d6ebd612: Status 404 returned error can't find the container with id 2acb041dc20410ed736cc53aed4c8b53c9103aba29d3a678b90efd17d6ebd612 Apr 19 15:31:13.997086 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.997050 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-2r9zr" event={"ID":"4b6aecfb-f463-4b12-8f75-9252f1a3d5ee","Type":"ContainerStarted","Data":"2acb041dc20410ed736cc53aed4c8b53c9103aba29d3a678b90efd17d6ebd612"} Apr 19 15:31:13.998310 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:13.998279 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-s4t8q" event={"ID":"f277f78d-2c61-44fc-a4ae-337ee6b1947e","Type":"ContainerStarted","Data":"75a7935e7b2ba4504f1c1d6cf91f6f7edd7794de1e604a0b367d6fcae8014e8f"} Apr 19 15:31:16.008919 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:16.008862 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-2r9zr" event={"ID":"4b6aecfb-f463-4b12-8f75-9252f1a3d5ee","Type":"ContainerStarted","Data":"53bbe621dfebcd887a11aa864c1fff3d6cf6ca75a00ad5835b6495b8a5994064"} Apr 19 15:31:16.016701 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:16.016669 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-s4t8q" event={"ID":"f277f78d-2c61-44fc-a4ae-337ee6b1947e","Type":"ContainerStarted","Data":"f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1"}
Apr 19 15:31:16.016878 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:16.016818 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-s4t8q" podUID="f277f78d-2c61-44fc-a4ae-337ee6b1947e" containerName="registry-server" containerID="cri-o://f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1" gracePeriod=2
Apr 19 15:31:16.028129 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:16.028083 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-2r9zr" podStartSLOduration=1.45726954 podStartE2EDuration="3.028066457s" podCreationTimestamp="2026-04-19 15:31:13 +0000 UTC" firstStartedPulling="2026-04-19 15:31:13.782884209 +0000 UTC m=+372.888799574" lastFinishedPulling="2026-04-19 15:31:15.353681117 +0000 UTC m=+374.459596491" observedRunningTime="2026-04-19 15:31:16.026280688 +0000 UTC m=+375.132196075" watchObservedRunningTime="2026-04-19 15:31:16.028066457 +0000 UTC m=+375.133981839"
Apr 19 15:31:16.040335 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:16.040289 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-s4t8q" podStartSLOduration=1.87532993 podStartE2EDuration="4.040274452s" podCreationTimestamp="2026-04-19 15:31:12 +0000 UTC" firstStartedPulling="2026-04-19 15:31:13.185885471 +0000 UTC m=+372.291800837" lastFinishedPulling="2026-04-19 15:31:15.35082999 +0000 UTC m=+374.456745359" observedRunningTime="2026-04-19 15:31:16.038760316 +0000 UTC m=+375.144675702" watchObservedRunningTime="2026-04-19 15:31:16.040274452 +0000 UTC m=+375.146189840"
Apr 19 15:31:16.256161 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:16.256137 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-s4t8q"
Apr 19 15:31:16.346870 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:16.346765 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfrpl\" (UniqueName: \"kubernetes.io/projected/f277f78d-2c61-44fc-a4ae-337ee6b1947e-kube-api-access-zfrpl\") pod \"f277f78d-2c61-44fc-a4ae-337ee6b1947e\" (UID: \"f277f78d-2c61-44fc-a4ae-337ee6b1947e\") "
Apr 19 15:31:16.349148 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:16.349120 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f277f78d-2c61-44fc-a4ae-337ee6b1947e-kube-api-access-zfrpl" (OuterVolumeSpecName: "kube-api-access-zfrpl") pod "f277f78d-2c61-44fc-a4ae-337ee6b1947e" (UID: "f277f78d-2c61-44fc-a4ae-337ee6b1947e"). InnerVolumeSpecName "kube-api-access-zfrpl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 15:31:16.447413 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:16.447371 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zfrpl\" (UniqueName: \"kubernetes.io/projected/f277f78d-2c61-44fc-a4ae-337ee6b1947e-kube-api-access-zfrpl\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\""
Apr 19 15:31:17.021676 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:17.021640 2579 generic.go:358] "Generic (PLEG): container finished" podID="f277f78d-2c61-44fc-a4ae-337ee6b1947e" containerID="f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1" exitCode=0
Apr 19 15:31:17.022185 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:17.021701 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-s4t8q"
Apr 19 15:31:17.022185 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:17.021740 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-s4t8q" event={"ID":"f277f78d-2c61-44fc-a4ae-337ee6b1947e","Type":"ContainerDied","Data":"f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1"}
Apr 19 15:31:17.022185 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:17.021773 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-s4t8q" event={"ID":"f277f78d-2c61-44fc-a4ae-337ee6b1947e","Type":"ContainerDied","Data":"75a7935e7b2ba4504f1c1d6cf91f6f7edd7794de1e604a0b367d6fcae8014e8f"}
Apr 19 15:31:17.022185 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:17.021790 2579 scope.go:117] "RemoveContainer" containerID="f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1"
Apr 19 15:31:17.031659 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:17.031642 2579 scope.go:117] "RemoveContainer" containerID="f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1"
Apr 19 15:31:17.031939 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:31:17.031922 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1\": container with ID starting with f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1 not found: ID does not exist" containerID="f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1"
Apr 19 15:31:17.032009 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:17.031950 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1"} err="failed to get container status \"f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1\": rpc error: code = NotFound desc = could not find container \"f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1\": container with ID starting with f692bbec3332a2e373dacbeb726e42133335d6cc5cb5a9acb9544eabbec841c1 not found: ID does not exist"
Apr 19 15:31:17.043409 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:17.043385 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-s4t8q"]
Apr 19 15:31:17.049471 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:17.049441 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-s4t8q"]
Apr 19 15:31:17.545591 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:17.545554 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f277f78d-2c61-44fc-a4ae-337ee6b1947e" path="/var/lib/kubelet/pods/f277f78d-2c61-44fc-a4ae-337ee6b1947e/volumes"
Apr 19 15:31:23.637426 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:23.637390 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-2r9zr"
Apr 19 15:31:23.637883 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:23.637442 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-2r9zr"
Apr 19 15:31:23.659342 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:23.659317 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-2r9zr"
Apr 19 15:31:24.068439 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:24.068412 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-2r9zr"
Apr 19 15:31:28.151734 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.151695 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"]
Apr 19 15:31:28.152177 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.152085 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f277f78d-2c61-44fc-a4ae-337ee6b1947e" containerName="registry-server"
Apr 19 15:31:28.152177 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.152096 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f277f78d-2c61-44fc-a4ae-337ee6b1947e" containerName="registry-server"
Apr 19 15:31:28.152177 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.152162 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f277f78d-2c61-44fc-a4ae-337ee6b1947e" containerName="registry-server"
Apr 19 15:31:28.155424 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.155405 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"
Apr 19 15:31:28.157407 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.157373 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qzbc7\""
Apr 19 15:31:28.161885 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.161861 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"]
Apr 19 15:31:28.255990 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.255944 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltd7x\" (UniqueName: \"kubernetes.io/projected/29a72e0e-4e70-414b-95a6-6ab81cf351b6-kube-api-access-ltd7x\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"
Apr 19 15:31:28.256164 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.256103 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"
Apr 19 15:31:28.256207 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.256170 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"
Apr 19 15:31:28.357034 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.356992 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"
Apr 19 15:31:28.357181 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.357049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"
Apr 19 15:31:28.357181 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.357099 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltd7x\" (UniqueName: \"kubernetes.io/projected/29a72e0e-4e70-414b-95a6-6ab81cf351b6-kube-api-access-ltd7x\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"
Apr 19 15:31:28.357490 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.357466 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"
Apr 19 15:31:28.357523 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.357478 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"
Apr 19 15:31:28.365448 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.365416 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltd7x\" (UniqueName: \"kubernetes.io/projected/29a72e0e-4e70-414b-95a6-6ab81cf351b6-kube-api-access-ltd7x\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"
Apr 19 15:31:28.466838 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.466804 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"
Apr 19 15:31:28.595836 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.595811 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws"]
Apr 19 15:31:28.598321 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:31:28.598292 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29a72e0e_4e70_414b_95a6_6ab81cf351b6.slice/crio-16af38b3abb4cd42cd2759c2a483afe98bc1593a27d788b4a4c3d3d28d453ed6 WatchSource:0}: Error finding container 16af38b3abb4cd42cd2759c2a483afe98bc1593a27d788b4a4c3d3d28d453ed6: Status 404 returned error can't find the container with id 16af38b3abb4cd42cd2759c2a483afe98bc1593a27d788b4a4c3d3d28d453ed6
Apr 19 15:31:28.752922 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.752883 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"]
Apr 19 15:31:28.756384 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.756368 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"
Apr 19 15:31:28.762612 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.762577 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"]
Apr 19 15:31:28.863406 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.863360 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"
Apr 19 15:31:28.863580 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.863424 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"
Apr 19 15:31:28.863580 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.863454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh9rc\" (UniqueName: \"kubernetes.io/projected/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-kube-api-access-mh9rc\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"
Apr 19 15:31:28.964362 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.964323 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"
Apr 19 15:31:28.964539 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.964383 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"
Apr 19 15:31:28.964539 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.964427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mh9rc\" (UniqueName: \"kubernetes.io/projected/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-kube-api-access-mh9rc\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"
Apr 19 15:31:28.964709 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.964688 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"
Apr 19 15:31:28.964790 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.964715 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"
Apr 19 15:31:28.972056 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:28.972010 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh9rc\" (UniqueName: \"kubernetes.io/projected/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-kube-api-access-mh9rc\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"
Apr 19 15:31:29.065250 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.065213 2579 generic.go:358] "Generic (PLEG): container finished" podID="29a72e0e-4e70-414b-95a6-6ab81cf351b6" containerID="d27c2793ae06a8f106a187834c1e1f469e0c1bcf29d829090dba59fb88aaa38d" exitCode=0
Apr 19 15:31:29.065426 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.065299 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws" event={"ID":"29a72e0e-4e70-414b-95a6-6ab81cf351b6","Type":"ContainerDied","Data":"d27c2793ae06a8f106a187834c1e1f469e0c1bcf29d829090dba59fb88aaa38d"}
Apr 19 15:31:29.065426 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.065335 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws" event={"ID":"29a72e0e-4e70-414b-95a6-6ab81cf351b6","Type":"ContainerStarted","Data":"16af38b3abb4cd42cd2759c2a483afe98bc1593a27d788b4a4c3d3d28d453ed6"}
Apr 19 15:31:29.082914 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.082884 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"
Apr 19 15:31:29.150945 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.150909 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"]
Apr 19 15:31:29.159679 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.159642 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"
Apr 19 15:31:29.160964 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.160939 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"]
Apr 19 15:31:29.216866 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.216839 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw"]
Apr 19 15:31:29.218760 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:31:29.218734 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fa5de8e_282f_472b_bc6d_b56bf8c2fc82.slice/crio-62f1bac7f7e2a7b482a39ddbe619a1ca113cf21b0bfbc9d81c48022c8f537b6b WatchSource:0}: Error finding container 62f1bac7f7e2a7b482a39ddbe619a1ca113cf21b0bfbc9d81c48022c8f537b6b: Status 404 returned error can't find the container with id 62f1bac7f7e2a7b482a39ddbe619a1ca113cf21b0bfbc9d81c48022c8f537b6b
Apr 19 15:31:29.266934 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.266853 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"
Apr 19 15:31:29.267009 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.266965 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwp88\" (UniqueName: \"kubernetes.io/projected/8b042e94-664b-4e73-9ca2-3a8f1330cacd-kube-api-access-gwp88\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"
Apr 19 15:31:29.267081 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.267064 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"
Apr 19 15:31:29.368311 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.368272 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwp88\" (UniqueName: \"kubernetes.io/projected/8b042e94-664b-4e73-9ca2-3a8f1330cacd-kube-api-access-gwp88\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"
Apr 19 15:31:29.368499 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.368335 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"
Apr 19 15:31:29.368566 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.368508 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"
Apr 19 15:31:29.368679 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.368662 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"
Apr 19 15:31:29.368878 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.368852 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"
Apr 19 15:31:29.376315 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.376287 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwp88\" (UniqueName: \"kubernetes.io/projected/8b042e94-664b-4e73-9ca2-3a8f1330cacd-kube-api-access-gwp88\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"
Apr 19 15:31:29.470922 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.470881 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"
Apr 19 15:31:29.557325 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.557289 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"]
Apr 19 15:31:29.561331 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.561302 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"
Apr 19 15:31:29.566807 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.566391 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"]
Apr 19 15:31:29.610564 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.610533 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs"]
Apr 19 15:31:29.669235 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:31:29.669193 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b042e94_664b_4e73_9ca2_3a8f1330cacd.slice/crio-b6cfbcb544f995f5cc560594c337fe5d1ce9873ac6a9fe2c99a30366ff87ce23 WatchSource:0}: Error finding container b6cfbcb544f995f5cc560594c337fe5d1ce9873ac6a9fe2c99a30366ff87ce23: Status 404 returned error can't find the container with id b6cfbcb544f995f5cc560594c337fe5d1ce9873ac6a9fe2c99a30366ff87ce23
Apr 19 15:31:29.670478 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.670442 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"
Apr 19 15:31:29.670654 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.670572 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"
Apr 19 15:31:29.670885 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.670783 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cnkv\" (UniqueName: \"kubernetes.io/projected/6a58af15-197e-4703-9c60-0b75920abcf7-kube-api-access-8cnkv\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"
Apr 19 15:31:29.772354 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.772327 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"
Apr 19 15:31:29.772471 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.772402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cnkv\" (UniqueName: \"kubernetes.io/projected/6a58af15-197e-4703-9c60-0b75920abcf7-kube-api-access-8cnkv\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"
Apr 19 15:31:29.772471 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.772436 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"
Apr 19 15:31:29.772748 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.772689 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"
Apr 19 15:31:29.772748 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.772714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"
Apr 19 15:31:29.781110 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.781086 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cnkv\" (UniqueName: \"kubernetes.io/projected/6a58af15-197e-4703-9c60-0b75920abcf7-kube-api-access-8cnkv\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"
Apr 19 15:31:29.873476 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:29.873444 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"
Apr 19 15:31:30.002298 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:30.002266 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz"]
Apr 19 15:31:30.003754 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:31:30.003703 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a58af15_197e_4703_9c60_0b75920abcf7.slice/crio-66f19a499c8244a16e22a7bc61108e5e837e2316926a4b58898a7d7122a0fbfb WatchSource:0}: Error finding container 66f19a499c8244a16e22a7bc61108e5e837e2316926a4b58898a7d7122a0fbfb: Status 404 returned error can't find the container with id 66f19a499c8244a16e22a7bc61108e5e837e2316926a4b58898a7d7122a0fbfb
Apr 19 15:31:30.070736 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:30.070672 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz" event={"ID":"6a58af15-197e-4703-9c60-0b75920abcf7","Type":"ContainerStarted","Data":"66f19a499c8244a16e22a7bc61108e5e837e2316926a4b58898a7d7122a0fbfb"}
Apr 19 15:31:30.072045 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:30.072021 2579 generic.go:358] "Generic (PLEG): container finished" podID="8b042e94-664b-4e73-9ca2-3a8f1330cacd" containerID="88177d2f2c271cdb964202a7fe09939c4fb5bab9e5fbb8d4c3b15a658b9b6c7f" exitCode=0
Apr 19 15:31:30.072184 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:30.072123 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs" event={"ID":"8b042e94-664b-4e73-9ca2-3a8f1330cacd","Type":"ContainerDied","Data":"88177d2f2c271cdb964202a7fe09939c4fb5bab9e5fbb8d4c3b15a658b9b6c7f"}
Apr 19 15:31:30.072184 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:30.072166 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs" event={"ID":"8b042e94-664b-4e73-9ca2-3a8f1330cacd","Type":"ContainerStarted","Data":"b6cfbcb544f995f5cc560594c337fe5d1ce9873ac6a9fe2c99a30366ff87ce23"}
Apr 19 15:31:30.074083 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:30.074061 2579 generic.go:358] "Generic (PLEG): container finished" podID="29a72e0e-4e70-414b-95a6-6ab81cf351b6" containerID="069769b6a2eeb2f31215d0489076c5e675faf5bae1e70bac3938153cf65be216" exitCode=0
Apr 19 15:31:30.074195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:30.074120 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws" event={"ID":"29a72e0e-4e70-414b-95a6-6ab81cf351b6","Type":"ContainerDied","Data":"069769b6a2eeb2f31215d0489076c5e675faf5bae1e70bac3938153cf65be216"}
Apr 19 15:31:30.075833 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:30.075781 2579 generic.go:358] "Generic (PLEG): container finished" podID="6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" containerID="59d8240006da7e338ede452740e05c371c277792bb8a75b185d521752458292a" exitCode=0
Apr 19 15:31:30.075903 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:30.075855 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw" event={"ID":"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82","Type":"ContainerDied","Data":"59d8240006da7e338ede452740e05c371c277792bb8a75b185d521752458292a"}
Apr 19 15:31:30.075903 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:30.075878 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw" event={"ID":"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82","Type":"ContainerStarted","Data":"62f1bac7f7e2a7b482a39ddbe619a1ca113cf21b0bfbc9d81c48022c8f537b6b"}
Apr 19 15:31:31.083205 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:31.083112 2579 generic.go:358] "Generic (PLEG): container finished" podID="6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" containerID="c28a09afe56acd2f402f62eebd29303ea775e0247b50b544cbe80b9dfeecf6ad" exitCode=0
Apr 19 15:31:31.083592 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:31.083197 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw" event={"ID":"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82","Type":"ContainerDied","Data":"c28a09afe56acd2f402f62eebd29303ea775e0247b50b544cbe80b9dfeecf6ad"}
Apr 19 15:31:31.084752 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:31.084701 2579 generic.go:358] "Generic (PLEG): container finished" podID="6a58af15-197e-4703-9c60-0b75920abcf7" containerID="d72cdb51505c4f3487941b6e411653e6c94085385fa2358464b7164a45b805e5" exitCode=0
Apr 19 15:31:31.084920 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:31.084839 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz" event={"ID":"6a58af15-197e-4703-9c60-0b75920abcf7","Type":"ContainerDied","Data":"d72cdb51505c4f3487941b6e411653e6c94085385fa2358464b7164a45b805e5"}
Apr 19 15:31:31.086706 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:31.086662 2579 generic.go:358] "Generic (PLEG): container finished" podID="8b042e94-664b-4e73-9ca2-3a8f1330cacd" containerID="2ee9a8e0429bb11e053841a41e23723358487086638140e3a3d48b6c9f2e9f5f" exitCode=0
Apr 19 15:31:31.086706 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:31.086691 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs" event={"ID":"8b042e94-664b-4e73-9ca2-3a8f1330cacd","Type":"ContainerDied","Data":"2ee9a8e0429bb11e053841a41e23723358487086638140e3a3d48b6c9f2e9f5f"}
Apr 19 15:31:31.088850 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:31.088791 2579 generic.go:358] "Generic (PLEG): container finished" podID="29a72e0e-4e70-414b-95a6-6ab81cf351b6" containerID="e828de97bb5f816f3db3e2b54dd9cb69b7f4a4c284350c1528f8e2e7d804ee1f" exitCode=0
Apr 19 15:31:31.088850 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:31.088818 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws" event={"ID":"29a72e0e-4e70-414b-95a6-6ab81cf351b6","Type":"ContainerDied","Data":"e828de97bb5f816f3db3e2b54dd9cb69b7f4a4c284350c1528f8e2e7d804ee1f"}
Apr 19 15:31:32.095448 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.095347 2579 generic.go:358] "Generic (PLEG): container finished" podID="6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" containerID="072b38a5bc57a36570c8b86608478aed3c9712ae4f041e4b5b3fcfbd73d4388f" exitCode=0
Apr 19 15:31:32.095865 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.095436 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw" event={"ID":"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82","Type":"ContainerDied","Data":"072b38a5bc57a36570c8b86608478aed3c9712ae4f041e4b5b3fcfbd73d4388f"}
Apr 19 15:31:32.097180 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.097162 2579 generic.go:358] "Generic (PLEG): container finished" podID="6a58af15-197e-4703-9c60-0b75920abcf7" containerID="5f66ce3ef93cd233dc5f1691f6571fc7682df3bd4f2492adb0ecc9fe51ba8016" exitCode=0
Apr 19 15:31:32.097266 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.097246 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz" event={"ID":"6a58af15-197e-4703-9c60-0b75920abcf7","Type":"ContainerDied","Data":"5f66ce3ef93cd233dc5f1691f6571fc7682df3bd4f2492adb0ecc9fe51ba8016"} Apr 19 15:31:32.099249 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.099195 2579 generic.go:358] "Generic (PLEG): container finished" podID="8b042e94-664b-4e73-9ca2-3a8f1330cacd" containerID="ff514ea5c481b9f9318c0c35d639d83e3b22a159019823e5c6934a5b322447ee" exitCode=0 Apr 19 15:31:32.099307 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.099273 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs" event={"ID":"8b042e94-664b-4e73-9ca2-3a8f1330cacd","Type":"ContainerDied","Data":"ff514ea5c481b9f9318c0c35d639d83e3b22a159019823e5c6934a5b322447ee"} Apr 19 15:31:32.257085 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.257064 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws" Apr 19 15:31:32.397447 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.397414 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-util\") pod \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " Apr 19 15:31:32.397568 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.397482 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltd7x\" (UniqueName: \"kubernetes.io/projected/29a72e0e-4e70-414b-95a6-6ab81cf351b6-kube-api-access-ltd7x\") pod \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " Apr 19 15:31:32.397568 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.397506 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-bundle\") pod \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\" (UID: \"29a72e0e-4e70-414b-95a6-6ab81cf351b6\") " Apr 19 15:31:32.398109 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.398081 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-bundle" (OuterVolumeSpecName: "bundle") pod "29a72e0e-4e70-414b-95a6-6ab81cf351b6" (UID: "29a72e0e-4e70-414b-95a6-6ab81cf351b6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:31:32.399750 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.399708 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a72e0e-4e70-414b-95a6-6ab81cf351b6-kube-api-access-ltd7x" (OuterVolumeSpecName: "kube-api-access-ltd7x") pod "29a72e0e-4e70-414b-95a6-6ab81cf351b6" (UID: "29a72e0e-4e70-414b-95a6-6ab81cf351b6"). InnerVolumeSpecName "kube-api-access-ltd7x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:31:32.404639 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.404612 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-util" (OuterVolumeSpecName: "util") pod "29a72e0e-4e70-414b-95a6-6ab81cf351b6" (UID: "29a72e0e-4e70-414b-95a6-6ab81cf351b6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:31:32.498799 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.498764 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-util\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:32.498799 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.498795 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ltd7x\" (UniqueName: \"kubernetes.io/projected/29a72e0e-4e70-414b-95a6-6ab81cf351b6-kube-api-access-ltd7x\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:32.498799 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:32.498805 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a72e0e-4e70-414b-95a6-6ab81cf351b6-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:33.110934 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.110895 2579 generic.go:358] "Generic 
(PLEG): container finished" podID="6a58af15-197e-4703-9c60-0b75920abcf7" containerID="f5e0fc460963eb759b35aaacdb18c7f5f7915a01776f61976e7c4e99da878ab5" exitCode=0 Apr 19 15:31:33.111360 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.110974 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz" event={"ID":"6a58af15-197e-4703-9c60-0b75920abcf7","Type":"ContainerDied","Data":"f5e0fc460963eb759b35aaacdb18c7f5f7915a01776f61976e7c4e99da878ab5"} Apr 19 15:31:33.112639 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.112620 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws" Apr 19 15:31:33.112707 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.112625 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws" event={"ID":"29a72e0e-4e70-414b-95a6-6ab81cf351b6","Type":"ContainerDied","Data":"16af38b3abb4cd42cd2759c2a483afe98bc1593a27d788b4a4c3d3d28d453ed6"} Apr 19 15:31:33.112707 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.112658 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16af38b3abb4cd42cd2759c2a483afe98bc1593a27d788b4a4c3d3d28d453ed6" Apr 19 15:31:33.275961 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.275936 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs" Apr 19 15:31:33.278862 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.278843 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw" Apr 19 15:31:33.308069 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.308036 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh9rc\" (UniqueName: \"kubernetes.io/projected/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-kube-api-access-mh9rc\") pod \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " Apr 19 15:31:33.308221 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.308096 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-bundle\") pod \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " Apr 19 15:31:33.308221 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.308122 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwp88\" (UniqueName: \"kubernetes.io/projected/8b042e94-664b-4e73-9ca2-3a8f1330cacd-kube-api-access-gwp88\") pod \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " Apr 19 15:31:33.308221 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.308153 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-bundle\") pod \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " Apr 19 15:31:33.309822 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.309765 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-bundle" (OuterVolumeSpecName: "bundle") pod "6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" (UID: "6fa5de8e-282f-472b-bc6d-b56bf8c2fc82"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:31:33.309954 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.309816 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-bundle" (OuterVolumeSpecName: "bundle") pod "8b042e94-664b-4e73-9ca2-3a8f1330cacd" (UID: "8b042e94-664b-4e73-9ca2-3a8f1330cacd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:31:33.310126 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.310094 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-util\") pod \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\" (UID: \"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82\") " Apr 19 15:31:33.310229 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.310165 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-util\") pod \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\" (UID: \"8b042e94-664b-4e73-9ca2-3a8f1330cacd\") " Apr 19 15:31:33.311830 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.310680 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:33.311830 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.310703 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:33.311830 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.310945 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-kube-api-access-mh9rc" (OuterVolumeSpecName: "kube-api-access-mh9rc") pod "6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" (UID: "6fa5de8e-282f-472b-bc6d-b56bf8c2fc82"). InnerVolumeSpecName "kube-api-access-mh9rc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:31:33.311830 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.311181 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b042e94-664b-4e73-9ca2-3a8f1330cacd-kube-api-access-gwp88" (OuterVolumeSpecName: "kube-api-access-gwp88") pod "8b042e94-664b-4e73-9ca2-3a8f1330cacd" (UID: "8b042e94-664b-4e73-9ca2-3a8f1330cacd"). InnerVolumeSpecName "kube-api-access-gwp88". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:31:33.318754 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.318696 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-util" (OuterVolumeSpecName: "util") pod "8b042e94-664b-4e73-9ca2-3a8f1330cacd" (UID: "8b042e94-664b-4e73-9ca2-3a8f1330cacd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:31:33.319002 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.318987 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-util" (OuterVolumeSpecName: "util") pod "6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" (UID: "6fa5de8e-282f-472b-bc6d-b56bf8c2fc82"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:31:33.412133 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.412089 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mh9rc\" (UniqueName: \"kubernetes.io/projected/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-kube-api-access-mh9rc\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:33.412133 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.412127 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gwp88\" (UniqueName: \"kubernetes.io/projected/8b042e94-664b-4e73-9ca2-3a8f1330cacd-kube-api-access-gwp88\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:33.412133 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.412141 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fa5de8e-282f-472b-bc6d-b56bf8c2fc82-util\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:33.412365 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:33.412155 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b042e94-664b-4e73-9ca2-3a8f1330cacd-util\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:34.118015 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.117984 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw" Apr 19 15:31:34.118410 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.117984 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw" event={"ID":"6fa5de8e-282f-472b-bc6d-b56bf8c2fc82","Type":"ContainerDied","Data":"62f1bac7f7e2a7b482a39ddbe619a1ca113cf21b0bfbc9d81c48022c8f537b6b"} Apr 19 15:31:34.118410 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.118114 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62f1bac7f7e2a7b482a39ddbe619a1ca113cf21b0bfbc9d81c48022c8f537b6b" Apr 19 15:31:34.119872 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.119844 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs" event={"ID":"8b042e94-664b-4e73-9ca2-3a8f1330cacd","Type":"ContainerDied","Data":"b6cfbcb544f995f5cc560594c337fe5d1ce9873ac6a9fe2c99a30366ff87ce23"} Apr 19 15:31:34.120000 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.119876 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6cfbcb544f995f5cc560594c337fe5d1ce9873ac6a9fe2c99a30366ff87ce23" Apr 19 15:31:34.120000 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.119893 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs" Apr 19 15:31:34.251086 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.251065 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz" Apr 19 15:31:34.320465 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.320436 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-util\") pod \"6a58af15-197e-4703-9c60-0b75920abcf7\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " Apr 19 15:31:34.320465 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.320468 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-bundle\") pod \"6a58af15-197e-4703-9c60-0b75920abcf7\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " Apr 19 15:31:34.320690 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.320514 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cnkv\" (UniqueName: \"kubernetes.io/projected/6a58af15-197e-4703-9c60-0b75920abcf7-kube-api-access-8cnkv\") pod \"6a58af15-197e-4703-9c60-0b75920abcf7\" (UID: \"6a58af15-197e-4703-9c60-0b75920abcf7\") " Apr 19 15:31:34.320969 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.320939 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-bundle" (OuterVolumeSpecName: "bundle") pod "6a58af15-197e-4703-9c60-0b75920abcf7" (UID: "6a58af15-197e-4703-9c60-0b75920abcf7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:31:34.322825 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.322799 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a58af15-197e-4703-9c60-0b75920abcf7-kube-api-access-8cnkv" (OuterVolumeSpecName: "kube-api-access-8cnkv") pod "6a58af15-197e-4703-9c60-0b75920abcf7" (UID: "6a58af15-197e-4703-9c60-0b75920abcf7"). InnerVolumeSpecName "kube-api-access-8cnkv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:31:34.326136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.326114 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-util" (OuterVolumeSpecName: "util") pod "6a58af15-197e-4703-9c60-0b75920abcf7" (UID: "6a58af15-197e-4703-9c60-0b75920abcf7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:31:34.421335 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.421225 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-util\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:34.421335 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.421276 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a58af15-197e-4703-9c60-0b75920abcf7-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:34.421335 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:34.421290 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8cnkv\" (UniqueName: \"kubernetes.io/projected/6a58af15-197e-4703-9c60-0b75920abcf7-kube-api-access-8cnkv\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:31:35.125044 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:35.125010 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz" event={"ID":"6a58af15-197e-4703-9c60-0b75920abcf7","Type":"ContainerDied","Data":"66f19a499c8244a16e22a7bc61108e5e837e2316926a4b58898a7d7122a0fbfb"} Apr 19 15:31:35.125044 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:35.125034 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz" Apr 19 15:31:35.125044 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:35.125045 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66f19a499c8244a16e22a7bc61108e5e837e2316926a4b58898a7d7122a0fbfb" Apr 19 15:31:44.200847 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.200808 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m"] Apr 19 15:31:44.201195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201133 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" containerName="util" Apr 19 15:31:44.201195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201144 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" containerName="util" Apr 19 15:31:44.201195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201153 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a58af15-197e-4703-9c60-0b75920abcf7" containerName="extract" Apr 19 15:31:44.201195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201158 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a58af15-197e-4703-9c60-0b75920abcf7" containerName="extract" Apr 19 15:31:44.201195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201166 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6a58af15-197e-4703-9c60-0b75920abcf7" containerName="pull" Apr 19 15:31:44.201195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201172 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a58af15-197e-4703-9c60-0b75920abcf7" containerName="pull" Apr 19 15:31:44.201195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201180 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" containerName="pull" Apr 19 15:31:44.201195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201184 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" containerName="pull" Apr 19 15:31:44.201195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201191 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29a72e0e-4e70-414b-95a6-6ab81cf351b6" containerName="extract" Apr 19 15:31:44.201195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201196 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a72e0e-4e70-414b-95a6-6ab81cf351b6" containerName="extract" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201202 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29a72e0e-4e70-414b-95a6-6ab81cf351b6" containerName="pull" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201207 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a72e0e-4e70-414b-95a6-6ab81cf351b6" containerName="pull" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201215 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b042e94-664b-4e73-9ca2-3a8f1330cacd" containerName="extract" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201220 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b042e94-664b-4e73-9ca2-3a8f1330cacd" containerName="extract" Apr 19 
15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201233 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a58af15-197e-4703-9c60-0b75920abcf7" containerName="util" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201237 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a58af15-197e-4703-9c60-0b75920abcf7" containerName="util" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201245 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b042e94-664b-4e73-9ca2-3a8f1330cacd" containerName="pull" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201249 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b042e94-664b-4e73-9ca2-3a8f1330cacd" containerName="pull" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201254 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29a72e0e-4e70-414b-95a6-6ab81cf351b6" containerName="util" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201259 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a72e0e-4e70-414b-95a6-6ab81cf351b6" containerName="util" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201264 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b042e94-664b-4e73-9ca2-3a8f1330cacd" containerName="util" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201269 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b042e94-664b-4e73-9ca2-3a8f1330cacd" containerName="util" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201274 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" containerName="extract" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:31:44.201279 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" containerName="extract" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201324 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="29a72e0e-4e70-414b-95a6-6ab81cf351b6" containerName="extract" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201330 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fa5de8e-282f-472b-bc6d-b56bf8c2fc82" containerName="extract" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201337 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b042e94-664b-4e73-9ca2-3a8f1330cacd" containerName="extract" Apr 19 15:31:44.201500 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.201346 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a58af15-197e-4703-9c60-0b75920abcf7" containerName="extract" Apr 19 15:31:44.205681 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.205659 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m" Apr 19 15:31:44.208081 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.208049 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 19 15:31:44.208183 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.208119 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-tkgjm\"" Apr 19 15:31:44.213986 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.213955 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m"] Apr 19 15:31:44.309182 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.309122 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfbn5\" (UniqueName: \"kubernetes.io/projected/63ad4b6f-3b46-4d70-abc4-8ecbc1a30d94-kube-api-access-rfbn5\") pod \"dns-operator-controller-manager-648d5c98bc-shj9m\" (UID: \"63ad4b6f-3b46-4d70-abc4-8ecbc1a30d94\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m" Apr 19 15:31:44.410062 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.410023 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfbn5\" (UniqueName: \"kubernetes.io/projected/63ad4b6f-3b46-4d70-abc4-8ecbc1a30d94-kube-api-access-rfbn5\") pod \"dns-operator-controller-manager-648d5c98bc-shj9m\" (UID: \"63ad4b6f-3b46-4d70-abc4-8ecbc1a30d94\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m" Apr 19 15:31:44.424623 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.424590 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfbn5\" (UniqueName: \"kubernetes.io/projected/63ad4b6f-3b46-4d70-abc4-8ecbc1a30d94-kube-api-access-rfbn5\") pod 
\"dns-operator-controller-manager-648d5c98bc-shj9m\" (UID: \"63ad4b6f-3b46-4d70-abc4-8ecbc1a30d94\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m" Apr 19 15:31:44.524794 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.524682 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m" Apr 19 15:31:44.652374 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:44.652345 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m"] Apr 19 15:31:44.654468 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:31:44.654444 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63ad4b6f_3b46_4d70_abc4_8ecbc1a30d94.slice/crio-f62c7856c3108a4a84433f71f05e845556c020aa5658355144bb7a7d3d87208d WatchSource:0}: Error finding container f62c7856c3108a4a84433f71f05e845556c020aa5658355144bb7a7d3d87208d: Status 404 returned error can't find the container with id f62c7856c3108a4a84433f71f05e845556c020aa5658355144bb7a7d3d87208d Apr 19 15:31:45.164450 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:45.164410 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m" event={"ID":"63ad4b6f-3b46-4d70-abc4-8ecbc1a30d94","Type":"ContainerStarted","Data":"f62c7856c3108a4a84433f71f05e845556c020aa5658355144bb7a7d3d87208d"} Apr 19 15:31:47.175886 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:47.175847 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m" event={"ID":"63ad4b6f-3b46-4d70-abc4-8ecbc1a30d94","Type":"ContainerStarted","Data":"7aa92ba7c167c94c45d10893dd81629e568d7390e876dc21a6748b0be0918dbf"} Apr 19 15:31:47.176273 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:47.175972 2579 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m" Apr 19 15:31:47.194907 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:47.194850 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m" podStartSLOduration=1.264747311 podStartE2EDuration="3.194832007s" podCreationTimestamp="2026-04-19 15:31:44 +0000 UTC" firstStartedPulling="2026-04-19 15:31:44.656306403 +0000 UTC m=+403.762221769" lastFinishedPulling="2026-04-19 15:31:46.586391086 +0000 UTC m=+405.692306465" observedRunningTime="2026-04-19 15:31:47.19400867 +0000 UTC m=+406.299924058" watchObservedRunningTime="2026-04-19 15:31:47.194832007 +0000 UTC m=+406.300747397" Apr 19 15:31:52.382410 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.382375 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h"] Apr 19 15:31:52.387011 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.386989 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:31:52.389073 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.389052 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ltvvm\"" Apr 19 15:31:52.398122 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.398099 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h"] Apr 19 15:31:52.479956 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.479909 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" (UID: \"af41b07b-84e3-43d1-b1f8-d81a4c46b35f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:31:52.480108 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.480083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnvg7\" (UniqueName: \"kubernetes.io/projected/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-kube-api-access-hnvg7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" (UID: \"af41b07b-84e3-43d1-b1f8-d81a4c46b35f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:31:52.580748 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.580680 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnvg7\" (UniqueName: \"kubernetes.io/projected/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-kube-api-access-hnvg7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" (UID: \"af41b07b-84e3-43d1-b1f8-d81a4c46b35f\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:31:52.580748 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.580749 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" (UID: \"af41b07b-84e3-43d1-b1f8-d81a4c46b35f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:31:52.581097 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.581081 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" (UID: \"af41b07b-84e3-43d1-b1f8-d81a4c46b35f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:31:52.593566 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.593535 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnvg7\" (UniqueName: \"kubernetes.io/projected/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-kube-api-access-hnvg7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" (UID: \"af41b07b-84e3-43d1-b1f8-d81a4c46b35f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:31:52.697967 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.697927 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:31:52.845598 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:52.845559 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h"] Apr 19 15:31:52.848619 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:31:52.848589 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf41b07b_84e3_43d1_b1f8_d81a4c46b35f.slice/crio-318cb63facffc3d8efdae051b40365b3f6c978a9a227c9b2fece7bb1d7d56089 WatchSource:0}: Error finding container 318cb63facffc3d8efdae051b40365b3f6c978a9a227c9b2fece7bb1d7d56089: Status 404 returned error can't find the container with id 318cb63facffc3d8efdae051b40365b3f6c978a9a227c9b2fece7bb1d7d56089 Apr 19 15:31:53.200404 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:53.200370 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" event={"ID":"af41b07b-84e3-43d1-b1f8-d81a4c46b35f","Type":"ContainerStarted","Data":"318cb63facffc3d8efdae051b40365b3f6c978a9a227c9b2fece7bb1d7d56089"} Apr 19 15:31:58.182269 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:58.182237 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-shj9m" Apr 19 15:31:58.222005 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:58.221965 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" event={"ID":"af41b07b-84e3-43d1-b1f8-d81a4c46b35f","Type":"ContainerStarted","Data":"0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8"} Apr 19 15:31:58.222159 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:58.222149 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:31:58.243540 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:31:58.243484 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" podStartSLOduration=1.044781538 podStartE2EDuration="6.243465997s" podCreationTimestamp="2026-04-19 15:31:52 +0000 UTC" firstStartedPulling="2026-04-19 15:31:52.851124391 +0000 UTC m=+411.957039757" lastFinishedPulling="2026-04-19 15:31:58.049808849 +0000 UTC m=+417.155724216" observedRunningTime="2026-04-19 15:31:58.23977973 +0000 UTC m=+417.345695211" watchObservedRunningTime="2026-04-19 15:31:58.243465997 +0000 UTC m=+417.349381391" Apr 19 15:32:00.780012 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.779972 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts"] Apr 19 15:32:00.784957 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.784932 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:00.787215 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.787191 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 19 15:32:00.788224 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.788202 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 19 15:32:00.788358 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.788273 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qzbc7\"" Apr 19 15:32:00.790577 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.790555 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts"] Apr 19 15:32:00.855232 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.855189 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp9xr\" (UniqueName: \"kubernetes.io/projected/486cbe01-c696-4420-b8d2-7a243e85b26d-kube-api-access-mp9xr\") pod \"kuadrant-console-plugin-6cb54b5c86-6xkts\" (UID: \"486cbe01-c696-4420-b8d2-7a243e85b26d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:00.855232 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.855229 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/486cbe01-c696-4420-b8d2-7a243e85b26d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-6xkts\" (UID: \"486cbe01-c696-4420-b8d2-7a243e85b26d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:00.855467 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.855304 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/486cbe01-c696-4420-b8d2-7a243e85b26d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-6xkts\" (UID: \"486cbe01-c696-4420-b8d2-7a243e85b26d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:00.951769 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.951703 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cd8d7499f-z7gk7"] Apr 19 15:32:00.955653 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.955630 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:00.956034 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.956012 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mp9xr\" (UniqueName: \"kubernetes.io/projected/486cbe01-c696-4420-b8d2-7a243e85b26d-kube-api-access-mp9xr\") pod \"kuadrant-console-plugin-6cb54b5c86-6xkts\" (UID: \"486cbe01-c696-4420-b8d2-7a243e85b26d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:00.956086 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.956047 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/486cbe01-c696-4420-b8d2-7a243e85b26d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-6xkts\" (UID: \"486cbe01-c696-4420-b8d2-7a243e85b26d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:00.956140 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.956124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/486cbe01-c696-4420-b8d2-7a243e85b26d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-6xkts\" (UID: \"486cbe01-c696-4420-b8d2-7a243e85b26d\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:00.956275 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:32:00.956257 2579 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 19 15:32:00.956344 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:32:00.956332 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/486cbe01-c696-4420-b8d2-7a243e85b26d-plugin-serving-cert podName:486cbe01-c696-4420-b8d2-7a243e85b26d nodeName:}" failed. No retries permitted until 2026-04-19 15:32:01.456310751 +0000 UTC m=+420.562226118 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/486cbe01-c696-4420-b8d2-7a243e85b26d-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-6xkts" (UID: "486cbe01-c696-4420-b8d2-7a243e85b26d") : secret "plugin-serving-cert" not found Apr 19 15:32:00.956839 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.956806 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/486cbe01-c696-4420-b8d2-7a243e85b26d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-6xkts\" (UID: \"486cbe01-c696-4420-b8d2-7a243e85b26d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:00.966329 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.966304 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd8d7499f-z7gk7"] Apr 19 15:32:00.969460 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:00.969439 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp9xr\" (UniqueName: \"kubernetes.io/projected/486cbe01-c696-4420-b8d2-7a243e85b26d-kube-api-access-mp9xr\") pod \"kuadrant-console-plugin-6cb54b5c86-6xkts\" (UID: \"486cbe01-c696-4420-b8d2-7a243e85b26d\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:01.057706 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.057619 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e64bbe5b-551a-4ec8-845b-34a15429f076-console-serving-cert\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.057706 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.057658 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-trusted-ca-bundle\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.057706 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.057682 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e64bbe5b-551a-4ec8-845b-34a15429f076-console-oauth-config\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.057973 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.057813 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxgmk\" (UniqueName: \"kubernetes.io/projected/e64bbe5b-551a-4ec8-845b-34a15429f076-kube-api-access-hxgmk\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.057973 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.057851 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-service-ca\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.057973 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.057881 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-oauth-serving-cert\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.057973 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.057961 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-console-config\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.159454 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.159389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-console-config\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.159454 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.159441 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e64bbe5b-551a-4ec8-845b-34a15429f076-console-serving-cert\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.159454 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:32:01.159459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-trusted-ca-bundle\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.159784 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.159485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e64bbe5b-551a-4ec8-845b-34a15429f076-console-oauth-config\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.159784 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.159544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxgmk\" (UniqueName: \"kubernetes.io/projected/e64bbe5b-551a-4ec8-845b-34a15429f076-kube-api-access-hxgmk\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.159784 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.159713 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-service-ca\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.159937 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.159796 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-oauth-serving-cert\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " 
pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.160374 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.160257 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-console-config\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.160645 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.160373 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-service-ca\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.160645 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.160456 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-oauth-serving-cert\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.160964 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.160941 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64bbe5b-551a-4ec8-845b-34a15429f076-trusted-ca-bundle\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.162322 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.162297 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e64bbe5b-551a-4ec8-845b-34a15429f076-console-serving-cert\") pod \"console-7cd8d7499f-z7gk7\" (UID: 
\"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.162410 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.162352 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e64bbe5b-551a-4ec8-845b-34a15429f076-console-oauth-config\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.167613 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.167594 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxgmk\" (UniqueName: \"kubernetes.io/projected/e64bbe5b-551a-4ec8-845b-34a15429f076-kube-api-access-hxgmk\") pod \"console-7cd8d7499f-z7gk7\" (UID: \"e64bbe5b-551a-4ec8-845b-34a15429f076\") " pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.266149 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.266113 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:01.400510 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.400485 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd8d7499f-z7gk7"] Apr 19 15:32:01.402945 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:32:01.402913 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode64bbe5b_551a_4ec8_845b_34a15429f076.slice/crio-d5019ae07eee8a244f39facaa1cef95fd111897a2b6779b0ab852f4171d70934 WatchSource:0}: Error finding container d5019ae07eee8a244f39facaa1cef95fd111897a2b6779b0ab852f4171d70934: Status 404 returned error can't find the container with id d5019ae07eee8a244f39facaa1cef95fd111897a2b6779b0ab852f4171d70934 Apr 19 15:32:01.463847 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.463759 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/486cbe01-c696-4420-b8d2-7a243e85b26d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-6xkts\" (UID: \"486cbe01-c696-4420-b8d2-7a243e85b26d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:01.467012 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.466953 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/486cbe01-c696-4420-b8d2-7a243e85b26d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-6xkts\" (UID: \"486cbe01-c696-4420-b8d2-7a243e85b26d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:01.697414 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.697387 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qzbc7\"" Apr 19 15:32:01.705624 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.705604 2579 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" Apr 19 15:32:01.829902 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:01.829866 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts"] Apr 19 15:32:01.831799 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:32:01.831770 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod486cbe01_c696_4420_b8d2_7a243e85b26d.slice/crio-896d3a9e7f51b0a3eefb3efcc175a454cbd6241a143fd44c0355d24cd8d9330b WatchSource:0}: Error finding container 896d3a9e7f51b0a3eefb3efcc175a454cbd6241a143fd44c0355d24cd8d9330b: Status 404 returned error can't find the container with id 896d3a9e7f51b0a3eefb3efcc175a454cbd6241a143fd44c0355d24cd8d9330b Apr 19 15:32:02.239791 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:02.239756 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" event={"ID":"486cbe01-c696-4420-b8d2-7a243e85b26d","Type":"ContainerStarted","Data":"896d3a9e7f51b0a3eefb3efcc175a454cbd6241a143fd44c0355d24cd8d9330b"} Apr 19 15:32:02.241124 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:02.241095 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd8d7499f-z7gk7" event={"ID":"e64bbe5b-551a-4ec8-845b-34a15429f076","Type":"ContainerStarted","Data":"89ef484ea8dc9159eaa224264cc5efdef0cdd283f522f307698f6cab3ac37a07"} Apr 19 15:32:02.241247 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:02.241132 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd8d7499f-z7gk7" event={"ID":"e64bbe5b-551a-4ec8-845b-34a15429f076","Type":"ContainerStarted","Data":"d5019ae07eee8a244f39facaa1cef95fd111897a2b6779b0ab852f4171d70934"} Apr 19 15:32:02.258845 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:32:02.258792 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cd8d7499f-z7gk7" podStartSLOduration=2.258776634 podStartE2EDuration="2.258776634s" podCreationTimestamp="2026-04-19 15:32:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:32:02.256159121 +0000 UTC m=+421.362074543" watchObservedRunningTime="2026-04-19 15:32:02.258776634 +0000 UTC m=+421.364692019" Apr 19 15:32:09.228847 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:09.228810 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:32:10.450801 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.450764 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9"] Apr 19 15:32:10.457868 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.457839 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" Apr 19 15:32:10.466642 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.466611 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9"] Apr 19 15:32:10.550364 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.550327 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-w9bh9\" (UID: \"56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" Apr 19 15:32:10.550527 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.550412 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf8cq\" (UniqueName: \"kubernetes.io/projected/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-kube-api-access-lf8cq\") pod \"kuadrant-operator-controller-manager-84b657d985-w9bh9\" (UID: \"56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" Apr 19 15:32:10.651623 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.651583 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-w9bh9\" (UID: \"56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" Apr 19 15:32:10.651848 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.651675 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf8cq\" 
(UniqueName: \"kubernetes.io/projected/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-kube-api-access-lf8cq\") pod \"kuadrant-operator-controller-manager-84b657d985-w9bh9\" (UID: \"56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" Apr 19 15:32:10.652028 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.652002 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-w9bh9\" (UID: \"56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" Apr 19 15:32:10.660612 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.660587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf8cq\" (UniqueName: \"kubernetes.io/projected/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-kube-api-access-lf8cq\") pod \"kuadrant-operator-controller-manager-84b657d985-w9bh9\" (UID: \"56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" Apr 19 15:32:10.773578 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.773496 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" Apr 19 15:32:10.872136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.871712 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h"] Apr 19 15:32:10.872136 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.872005 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" containerName="manager" containerID="cri-o://0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8" gracePeriod=2 Apr 19 15:32:10.874288 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.874006 2579 status_manager.go:895] "Failed to get status for pod" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:10.876266 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.876236 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h"] Apr 19 15:32:10.893569 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.893514 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt"] Apr 19 15:32:10.894100 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.894079 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" containerName="manager" Apr 19 15:32:10.894100 ip-10-0-133-218 kubenswrapper[2579]: 
I0419 15:32:10.894103 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" containerName="manager" Apr 19 15:32:10.894262 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.894236 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" containerName="manager" Apr 19 15:32:10.897469 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.897447 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:10.899612 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.899572 2579 status_manager.go:895] "Failed to get status for pod" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:10.907363 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.906639 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt"] Apr 19 15:32:10.910249 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.910224 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9"] Apr 19 15:32:10.923336 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.923287 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9"] Apr 19 15:32:10.939955 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.939928 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h"] Apr 19 15:32:10.944182 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.944156 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" Apr 19 15:32:10.947226 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.947195 2579 status_manager.go:895] "Failed to get status for pod" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:10.960880 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:10.960852 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h"] Apr 19 15:32:11.054872 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.054794 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8llhc\" (UniqueName: \"kubernetes.io/projected/55c79b63-7131-4c1d-ab01-420a97395ed9-kube-api-access-8llhc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fx6xt\" (UID: \"55c79b63-7131-4c1d-ab01-420a97395ed9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:11.055047 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.054870 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/53370a72-a5e8-48a1-98b3-b76f23e41002-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-qgw6h\" (UID: 
\"53370a72-a5e8-48a1-98b3-b76f23e41002\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" Apr 19 15:32:11.055047 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.054945 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/55c79b63-7131-4c1d-ab01-420a97395ed9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fx6xt\" (UID: \"55c79b63-7131-4c1d-ab01-420a97395ed9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:11.055162 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.055053 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgxfb\" (UniqueName: \"kubernetes.io/projected/53370a72-a5e8-48a1-98b3-b76f23e41002-kube-api-access-pgxfb\") pod \"kuadrant-operator-controller-manager-84b657d985-qgw6h\" (UID: \"53370a72-a5e8-48a1-98b3-b76f23e41002\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" Apr 19 15:32:11.156304 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.156263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8llhc\" (UniqueName: \"kubernetes.io/projected/55c79b63-7131-4c1d-ab01-420a97395ed9-kube-api-access-8llhc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fx6xt\" (UID: \"55c79b63-7131-4c1d-ab01-420a97395ed9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:11.156493 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.156326 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/53370a72-a5e8-48a1-98b3-b76f23e41002-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-qgw6h\" (UID: 
\"53370a72-a5e8-48a1-98b3-b76f23e41002\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" Apr 19 15:32:11.156493 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.156384 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/55c79b63-7131-4c1d-ab01-420a97395ed9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fx6xt\" (UID: \"55c79b63-7131-4c1d-ab01-420a97395ed9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:11.156493 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.156454 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgxfb\" (UniqueName: \"kubernetes.io/projected/53370a72-a5e8-48a1-98b3-b76f23e41002-kube-api-access-pgxfb\") pod \"kuadrant-operator-controller-manager-84b657d985-qgw6h\" (UID: \"53370a72-a5e8-48a1-98b3-b76f23e41002\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" Apr 19 15:32:11.156817 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.156791 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/53370a72-a5e8-48a1-98b3-b76f23e41002-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-qgw6h\" (UID: \"53370a72-a5e8-48a1-98b3-b76f23e41002\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" Apr 19 15:32:11.156904 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.156858 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/55c79b63-7131-4c1d-ab01-420a97395ed9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fx6xt\" (UID: \"55c79b63-7131-4c1d-ab01-420a97395ed9\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:11.175296 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.175264 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8llhc\" (UniqueName: \"kubernetes.io/projected/55c79b63-7131-4c1d-ab01-420a97395ed9-kube-api-access-8llhc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fx6xt\" (UID: \"55c79b63-7131-4c1d-ab01-420a97395ed9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:11.175464 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.175417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgxfb\" (UniqueName: \"kubernetes.io/projected/53370a72-a5e8-48a1-98b3-b76f23e41002-kube-api-access-pgxfb\") pod \"kuadrant-operator-controller-manager-84b657d985-qgw6h\" (UID: \"53370a72-a5e8-48a1-98b3-b76f23e41002\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" Apr 19 15:32:11.248826 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.248778 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:11.259802 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.259775 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" Apr 19 15:32:11.266531 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.266506 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:11.266673 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.266547 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:11.272326 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.272305 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:11.274925 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.274888 2579 status_manager.go:895] "Failed to get status for pod" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:11.289449 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.289423 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cd8d7499f-z7gk7" Apr 19 15:32:11.302285 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.302247 2579 status_manager.go:895] "Failed to get status for pod" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between 
node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:11.369998 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.369910 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-579f7fb596-jpftg"] Apr 19 15:32:11.385422 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.385380 2579 status_manager.go:895] "Failed to get status for pod" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:11.547836 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:11.547791 2579 status_manager.go:895] "Failed to get status for pod" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:24.562243 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:24.561989 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:32:24.566005 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:24.565955 2579 status_manager.go:895] "Failed to get status for pod" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:24.568779 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:24.568753 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt"] Apr 19 15:32:24.573160 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:32:24.573118 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c79b63_7131_4c1d_ab01_420a97395ed9.slice/crio-4933c38f009eda195ec89018ff4ee927c3fcf6106b9581ff9765feb3adf7c391 WatchSource:0}: Error finding container 4933c38f009eda195ec89018ff4ee927c3fcf6106b9581ff9765feb3adf7c391: Status 404 returned error can't find the container with id 4933c38f009eda195ec89018ff4ee927c3fcf6106b9581ff9765feb3adf7c391 Apr 19 15:32:24.586882 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:24.586848 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnvg7\" (UniqueName: \"kubernetes.io/projected/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-kube-api-access-hnvg7\") pod \"af41b07b-84e3-43d1-b1f8-d81a4c46b35f\" (UID: \"af41b07b-84e3-43d1-b1f8-d81a4c46b35f\") " Apr 19 15:32:24.586993 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:24.586916 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-extensions-socket-volume\") pod \"af41b07b-84e3-43d1-b1f8-d81a4c46b35f\" (UID: \"af41b07b-84e3-43d1-b1f8-d81a4c46b35f\") " Apr 19 15:32:24.588105 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:24.587569 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "af41b07b-84e3-43d1-b1f8-d81a4c46b35f" (UID: "af41b07b-84e3-43d1-b1f8-d81a4c46b35f"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:32:24.590536 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:24.590501 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-kube-api-access-hnvg7" (OuterVolumeSpecName: "kube-api-access-hnvg7") pod "af41b07b-84e3-43d1-b1f8-d81a4c46b35f" (UID: "af41b07b-84e3-43d1-b1f8-d81a4c46b35f"). InnerVolumeSpecName "kube-api-access-hnvg7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:32:24.592867 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:24.592844 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h"] Apr 19 15:32:24.594466 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:32:24.594439 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53370a72_a5e8_48a1_98b3_b76f23e41002.slice/crio-c186d2437e2a90f4d81d8f93b47bd81951c7012fd53ae0362c8d2d8526386a5d WatchSource:0}: Error finding container c186d2437e2a90f4d81d8f93b47bd81951c7012fd53ae0362c8d2d8526386a5d: Status 404 returned error can't find the container with id c186d2437e2a90f4d81d8f93b47bd81951c7012fd53ae0362c8d2d8526386a5d Apr 19 15:32:24.687618 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:24.687588 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnvg7\" (UniqueName: \"kubernetes.io/projected/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-kube-api-access-hnvg7\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:24.687618 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:24.687621 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/af41b07b-84e3-43d1-b1f8-d81a4c46b35f-extensions-socket-volume\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:25.347994 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.347956 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" event={"ID":"55c79b63-7131-4c1d-ab01-420a97395ed9","Type":"ContainerStarted","Data":"6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9"} Apr 19 15:32:25.348169 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.348004 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" event={"ID":"55c79b63-7131-4c1d-ab01-420a97395ed9","Type":"ContainerStarted","Data":"4933c38f009eda195ec89018ff4ee927c3fcf6106b9581ff9765feb3adf7c391"} Apr 19 15:32:25.348169 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.348075 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:25.349601 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.349573 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" event={"ID":"53370a72-a5e8-48a1-98b3-b76f23e41002","Type":"ContainerStarted","Data":"34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b"} Apr 19 15:32:25.349601 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.349604 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" event={"ID":"53370a72-a5e8-48a1-98b3-b76f23e41002","Type":"ContainerStarted","Data":"c186d2437e2a90f4d81d8f93b47bd81951c7012fd53ae0362c8d2d8526386a5d"} Apr 19 15:32:25.349823 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.349804 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" Apr 19 15:32:25.349916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.349879 2579 status_manager.go:895] "Failed to get status for pod" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 
15:32:25.350835 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.350812 2579 generic.go:358] "Generic (PLEG): container finished" podID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" containerID="0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8" exitCode=0 Apr 19 15:32:25.350907 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.350857 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" Apr 19 15:32:25.350976 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.350858 2579 scope.go:117] "RemoveContainer" containerID="0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8" Apr 19 15:32:25.352375 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.352351 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" event={"ID":"486cbe01-c696-4420-b8d2-7a243e85b26d","Type":"ContainerStarted","Data":"0d6e6a28522f4ec10b3888212eb35f2f741a744b65104bebd364f2c982b4063c"} Apr 19 15:32:25.360874 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.360855 2579 scope.go:117] "RemoveContainer" containerID="0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8" Apr 19 15:32:25.361144 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:32:25.361127 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8\": container with ID starting with 0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8 not found: ID does not exist" containerID="0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8" Apr 19 15:32:25.361252 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.361153 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8"} 
err="failed to get container status \"0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8\": rpc error: code = NotFound desc = could not find container \"0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8\": container with ID starting with 0f1e06355cf231f5c0e1623ee9a959b43689123f7d1ce71a9a89ba9001a959c8 not found: ID does not exist" Apr 19 15:32:25.367329 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.367298 2579 status_manager.go:895] "Failed to get status for pod" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-khd8h" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-khd8h\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:25.367826 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.367791 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" podStartSLOduration=15.367779654 podStartE2EDuration="15.367779654s" podCreationTimestamp="2026-04-19 15:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:32:25.365311814 +0000 UTC m=+444.471227204" watchObservedRunningTime="2026-04-19 15:32:25.367779654 +0000 UTC m=+444.473695043" Apr 19 15:32:25.383530 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.383481 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6xkts" podStartSLOduration=2.711321354 podStartE2EDuration="25.383465826s" podCreationTimestamp="2026-04-19 15:32:00 +0000 UTC" firstStartedPulling="2026-04-19 15:32:01.83309574 +0000 UTC m=+420.939011105" lastFinishedPulling="2026-04-19 
15:32:24.50524021 +0000 UTC m=+443.611155577" observedRunningTime="2026-04-19 15:32:25.382340399 +0000 UTC m=+444.488255786" watchObservedRunningTime="2026-04-19 15:32:25.383465826 +0000 UTC m=+444.489381215" Apr 19 15:32:25.399657 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.399611 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" podStartSLOduration=15.399594898 podStartE2EDuration="15.399594898s" podCreationTimestamp="2026-04-19 15:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:32:25.399084037 +0000 UTC m=+444.504999426" watchObservedRunningTime="2026-04-19 15:32:25.399594898 +0000 UTC m=+444.505510287" Apr 19 15:32:25.551737 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:25.551692 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af41b07b-84e3-43d1-b1f8-d81a4c46b35f" path="/var/lib/kubelet/pods/af41b07b-84e3-43d1-b1f8-d81a4c46b35f/volumes" Apr 19 15:32:26.935094 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:32:26.935065 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ad8ec5_cdd6_4d63_b9fa_861e7c5081f7.slice/crio-4950364689d0e3d91ad2714a1c7555e1d8463c60b612546f09c7be119713e0cb WatchSource:0}: Error finding container 4950364689d0e3d91ad2714a1c7555e1d8463c60b612546f09c7be119713e0cb: Status 404 returned error can't find the container with id 4950364689d0e3d91ad2714a1c7555e1d8463c60b612546f09c7be119713e0cb Apr 19 15:32:27.363344 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:27.363247 2579 generic.go:358] "Generic (PLEG): container finished" podID="56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7" containerID="e493c6abbadc9b8c903902fc7fd71785ae6741296d3fa45124d6068213cd4371" exitCode=1 Apr 19 15:32:27.365455 ip-10-0-133-218 kubenswrapper[2579]: I0419 
15:32:27.365426 2579 status_manager.go:895] "Failed to get status for pod" podUID="56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" err="pods \"kuadrant-operator-controller-manager-84b657d985-w9bh9\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:27.395468 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:27.395447 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" Apr 19 15:32:27.399139 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:27.399113 2579 status_manager.go:895] "Failed to get status for pod" podUID="56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" err="pods \"kuadrant-operator-controller-manager-84b657d985-w9bh9\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:27.514401 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:27.514364 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-extensions-socket-volume\") pod \"56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7\" (UID: \"56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7\") " Apr 19 15:32:27.514548 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:27.514441 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf8cq\" (UniqueName: \"kubernetes.io/projected/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-kube-api-access-lf8cq\") pod 
\"56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7\" (UID: \"56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7\") " Apr 19 15:32:27.514660 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:27.514636 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7" (UID: "56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:32:27.516674 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:27.516654 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-kube-api-access-lf8cq" (OuterVolumeSpecName: "kube-api-access-lf8cq") pod "56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7" (UID: "56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7"). InnerVolumeSpecName "kube-api-access-lf8cq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:32:27.546815 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:27.546787 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7" path="/var/lib/kubelet/pods/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7/volumes" Apr 19 15:32:27.615785 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:27.615690 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-extensions-socket-volume\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:27.615785 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:27.615713 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lf8cq\" (UniqueName: \"kubernetes.io/projected/56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7-kube-api-access-lf8cq\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:28.369157 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:28.369080 2579 scope.go:117] "RemoveContainer" containerID="e493c6abbadc9b8c903902fc7fd71785ae6741296d3fa45124d6068213cd4371" Apr 19 15:32:28.369157 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:28.369080 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" Apr 19 15:32:28.371105 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:28.371060 2579 status_manager.go:895] "Failed to get status for pod" podUID="56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" err="pods \"kuadrant-operator-controller-manager-84b657d985-w9bh9\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:28.373158 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:28.373131 2579 status_manager.go:895] "Failed to get status for pod" podUID="56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-w9bh9" err="pods \"kuadrant-operator-controller-manager-84b657d985-w9bh9\" is forbidden: User \"system:node:ip-10-0-133-218.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-218.ec2.internal' and this object" Apr 19 15:32:36.359027 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.358989 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:36.359488 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.359052 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" Apr 19 15:32:36.392965 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.392897 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-579f7fb596-jpftg" podUID="af1accab-19c1-4b1c-a8e7-290d0f5252a4" containerName="console" 
containerID="cri-o://1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1" gracePeriod=15 Apr 19 15:32:36.433372 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.433340 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt"] Apr 19 15:32:36.433571 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.433549 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" podUID="55c79b63-7131-4c1d-ab01-420a97395ed9" containerName="manager" containerID="cri-o://6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9" gracePeriod=10 Apr 19 15:32:36.649859 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.649828 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n"] Apr 19 15:32:36.650255 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.650239 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7" containerName="manager" Apr 19 15:32:36.650358 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.650257 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7" containerName="manager" Apr 19 15:32:36.650358 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.650323 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="56ad8ec5-cdd6-4d63-b9fa-861e7c5081f7" containerName="manager" Apr 19 15:32:36.653528 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.653510 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:32:36.663518 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.663365 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n"] Apr 19 15:32:36.697883 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.697857 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5h7b\" (UniqueName: \"kubernetes.io/projected/1af66041-ebb4-40f6-a6a9-f491ef0efd14-kube-api-access-d5h7b\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zgp2n\" (UID: \"1af66041-ebb4-40f6-a6a9-f491ef0efd14\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:32:36.698046 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.697921 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1af66041-ebb4-40f6-a6a9-f491ef0efd14-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zgp2n\" (UID: \"1af66041-ebb4-40f6-a6a9-f491ef0efd14\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:32:36.708449 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.708426 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-579f7fb596-jpftg_af1accab-19c1-4b1c-a8e7-290d0f5252a4/console/0.log" Apr 19 15:32:36.708565 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.708491 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:32:36.712225 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.712189 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:36.798603 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.798565 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-oauth-config\") pod \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " Apr 19 15:32:36.798796 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.798665 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/55c79b63-7131-4c1d-ab01-420a97395ed9-extensions-socket-volume\") pod \"55c79b63-7131-4c1d-ab01-420a97395ed9\" (UID: \"55c79b63-7131-4c1d-ab01-420a97395ed9\") " Apr 19 15:32:36.798796 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.798748 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-service-ca\") pod \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " Apr 19 15:32:36.798796 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.798776 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-trusted-ca-bundle\") pod \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " Apr 19 15:32:36.799341 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.799170 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-service-ca" (OuterVolumeSpecName: "service-ca") pod "af1accab-19c1-4b1c-a8e7-290d0f5252a4" (UID: "af1accab-19c1-4b1c-a8e7-290d0f5252a4"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:32:36.799341 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.799215 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c79b63-7131-4c1d-ab01-420a97395ed9-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "55c79b63-7131-4c1d-ab01-420a97395ed9" (UID: "55c79b63-7131-4c1d-ab01-420a97395ed9"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:32:36.799603 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.799577 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-oauth-serving-cert\") pod \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " Apr 19 15:32:36.799760 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.799631 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8llhc\" (UniqueName: \"kubernetes.io/projected/55c79b63-7131-4c1d-ab01-420a97395ed9-kube-api-access-8llhc\") pod \"55c79b63-7131-4c1d-ab01-420a97395ed9\" (UID: \"55c79b63-7131-4c1d-ab01-420a97395ed9\") " Apr 19 15:32:36.799760 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.799658 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-serving-cert\") pod \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " Apr 19 15:32:36.799760 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.799675 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-trusted-ca-bundle" (OuterVolumeSpecName: 
"trusted-ca-bundle") pod "af1accab-19c1-4b1c-a8e7-290d0f5252a4" (UID: "af1accab-19c1-4b1c-a8e7-290d0f5252a4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:32:36.799760 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.799744 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-config\") pod \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " Apr 19 15:32:36.799994 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.799793 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h6kl\" (UniqueName: \"kubernetes.io/projected/af1accab-19c1-4b1c-a8e7-290d0f5252a4-kube-api-access-2h6kl\") pod \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\" (UID: \"af1accab-19c1-4b1c-a8e7-290d0f5252a4\") " Apr 19 15:32:36.799994 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.799965 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "af1accab-19c1-4b1c-a8e7-290d0f5252a4" (UID: "af1accab-19c1-4b1c-a8e7-290d0f5252a4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:32:36.800144 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.800117 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5h7b\" (UniqueName: \"kubernetes.io/projected/1af66041-ebb4-40f6-a6a9-f491ef0efd14-kube-api-access-d5h7b\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zgp2n\" (UID: \"1af66041-ebb4-40f6-a6a9-f491ef0efd14\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:32:36.800244 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.800144 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-config" (OuterVolumeSpecName: "console-config") pod "af1accab-19c1-4b1c-a8e7-290d0f5252a4" (UID: "af1accab-19c1-4b1c-a8e7-290d0f5252a4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 15:32:36.800244 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.800219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1af66041-ebb4-40f6-a6a9-f491ef0efd14-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zgp2n\" (UID: \"1af66041-ebb4-40f6-a6a9-f491ef0efd14\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:32:36.800403 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.800327 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-config\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:36.800403 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.800343 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/55c79b63-7131-4c1d-ab01-420a97395ed9-extensions-socket-volume\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:36.800403 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.800358 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-service-ca\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:36.800403 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.800373 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-trusted-ca-bundle\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:36.800403 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.800386 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/af1accab-19c1-4b1c-a8e7-290d0f5252a4-oauth-serving-cert\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:36.800874 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.800849 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1af66041-ebb4-40f6-a6a9-f491ef0efd14-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zgp2n\" (UID: \"1af66041-ebb4-40f6-a6a9-f491ef0efd14\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:32:36.801472 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.801449 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "af1accab-19c1-4b1c-a8e7-290d0f5252a4" (UID: "af1accab-19c1-4b1c-a8e7-290d0f5252a4"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 15:32:36.802099 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.802078 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c79b63-7131-4c1d-ab01-420a97395ed9-kube-api-access-8llhc" (OuterVolumeSpecName: "kube-api-access-8llhc") pod "55c79b63-7131-4c1d-ab01-420a97395ed9" (UID: "55c79b63-7131-4c1d-ab01-420a97395ed9"). InnerVolumeSpecName "kube-api-access-8llhc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:32:36.802447 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.802421 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "af1accab-19c1-4b1c-a8e7-290d0f5252a4" (UID: "af1accab-19c1-4b1c-a8e7-290d0f5252a4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 15:32:36.802566 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.802546 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1accab-19c1-4b1c-a8e7-290d0f5252a4-kube-api-access-2h6kl" (OuterVolumeSpecName: "kube-api-access-2h6kl") pod "af1accab-19c1-4b1c-a8e7-290d0f5252a4" (UID: "af1accab-19c1-4b1c-a8e7-290d0f5252a4"). InnerVolumeSpecName "kube-api-access-2h6kl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:32:36.818470 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.818449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5h7b\" (UniqueName: \"kubernetes.io/projected/1af66041-ebb4-40f6-a6a9-f491ef0efd14-kube-api-access-d5h7b\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zgp2n\" (UID: \"1af66041-ebb4-40f6-a6a9-f491ef0efd14\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:32:36.900941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.900849 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8llhc\" (UniqueName: \"kubernetes.io/projected/55c79b63-7131-4c1d-ab01-420a97395ed9-kube-api-access-8llhc\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:36.900941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.900885 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-serving-cert\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:36.900941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.900895 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2h6kl\" (UniqueName: \"kubernetes.io/projected/af1accab-19c1-4b1c-a8e7-290d0f5252a4-kube-api-access-2h6kl\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:36.900941 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.900904 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/af1accab-19c1-4b1c-a8e7-290d0f5252a4-console-oauth-config\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:36.967915 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:36.967875 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:32:37.118294 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.118257 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n"] Apr 19 15:32:37.121520 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:32:37.121483 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1af66041_ebb4_40f6_a6a9_f491ef0efd14.slice/crio-2a2952d6bc156c91bffa275486988b1db91c5806063dfb4fac0eddb6b7c4c657 WatchSource:0}: Error finding container 2a2952d6bc156c91bffa275486988b1db91c5806063dfb4fac0eddb6b7c4c657: Status 404 returned error can't find the container with id 2a2952d6bc156c91bffa275486988b1db91c5806063dfb4fac0eddb6b7c4c657 Apr 19 15:32:37.409275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.409196 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-579f7fb596-jpftg_af1accab-19c1-4b1c-a8e7-290d0f5252a4/console/0.log" Apr 19 15:32:37.409275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.409236 2579 generic.go:358] "Generic (PLEG): container finished" podID="af1accab-19c1-4b1c-a8e7-290d0f5252a4" containerID="1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1" exitCode=2 Apr 19 15:32:37.409769 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.409306 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579f7fb596-jpftg" event={"ID":"af1accab-19c1-4b1c-a8e7-290d0f5252a4","Type":"ContainerDied","Data":"1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1"} Apr 19 15:32:37.409769 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.409334 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579f7fb596-jpftg" 
event={"ID":"af1accab-19c1-4b1c-a8e7-290d0f5252a4","Type":"ContainerDied","Data":"8395a04ea9f5b529a029a514b9a284198b0c1c95dfaaf9cdf8490a809b0db53f"} Apr 19 15:32:37.409769 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.409353 2579 scope.go:117] "RemoveContainer" containerID="1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1" Apr 19 15:32:37.409769 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.409358 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579f7fb596-jpftg" Apr 19 15:32:37.411013 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.410937 2579 generic.go:358] "Generic (PLEG): container finished" podID="55c79b63-7131-4c1d-ab01-420a97395ed9" containerID="6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9" exitCode=0 Apr 19 15:32:37.411118 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.411090 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" event={"ID":"55c79b63-7131-4c1d-ab01-420a97395ed9","Type":"ContainerDied","Data":"6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9"} Apr 19 15:32:37.411184 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.411118 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" event={"ID":"55c79b63-7131-4c1d-ab01-420a97395ed9","Type":"ContainerDied","Data":"4933c38f009eda195ec89018ff4ee927c3fcf6106b9581ff9765feb3adf7c391"} Apr 19 15:32:37.414762 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.411248 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt" Apr 19 15:32:37.414762 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.412879 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" event={"ID":"1af66041-ebb4-40f6-a6a9-f491ef0efd14","Type":"ContainerStarted","Data":"010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f"} Apr 19 15:32:37.414762 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.412904 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" event={"ID":"1af66041-ebb4-40f6-a6a9-f491ef0efd14","Type":"ContainerStarted","Data":"2a2952d6bc156c91bffa275486988b1db91c5806063dfb4fac0eddb6b7c4c657"} Apr 19 15:32:37.414762 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.413359 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:32:37.421065 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.421042 2579 scope.go:117] "RemoveContainer" containerID="1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1" Apr 19 15:32:37.421354 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:32:37.421335 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1\": container with ID starting with 1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1 not found: ID does not exist" containerID="1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1" Apr 19 15:32:37.421416 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.421364 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1"} 
err="failed to get container status \"1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1\": rpc error: code = NotFound desc = could not find container \"1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1\": container with ID starting with 1356884c2a1eb520074dbe35a1c61971db21f64bb79009352f492fb6e596a9f1 not found: ID does not exist" Apr 19 15:32:37.421416 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.421382 2579 scope.go:117] "RemoveContainer" containerID="6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9" Apr 19 15:32:37.430069 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.430052 2579 scope.go:117] "RemoveContainer" containerID="6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9" Apr 19 15:32:37.430324 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:32:37.430304 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9\": container with ID starting with 6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9 not found: ID does not exist" containerID="6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9" Apr 19 15:32:37.430374 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.430331 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9"} err="failed to get container status \"6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9\": rpc error: code = NotFound desc = could not find container \"6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9\": container with ID starting with 6731a6e7ddba7c73b55679bb7b6df5bb68c308b2296e176bb47ec378a7d330e9 not found: ID does not exist" Apr 19 15:32:37.435433 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.435380 2579 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" podStartSLOduration=1.435366898 podStartE2EDuration="1.435366898s" podCreationTimestamp="2026-04-19 15:32:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:32:37.433267689 +0000 UTC m=+456.539183082" watchObservedRunningTime="2026-04-19 15:32:37.435366898 +0000 UTC m=+456.541282286" Apr 19 15:32:37.459835 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.459805 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-579f7fb596-jpftg"] Apr 19 15:32:37.463441 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.463415 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-579f7fb596-jpftg"] Apr 19 15:32:37.473635 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.473608 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt"] Apr 19 15:32:37.477683 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.477654 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fx6xt"] Apr 19 15:32:37.546668 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.546634 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c79b63-7131-4c1d-ab01-420a97395ed9" path="/var/lib/kubelet/pods/55c79b63-7131-4c1d-ab01-420a97395ed9/volumes" Apr 19 15:32:37.547061 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:37.547047 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1accab-19c1-4b1c-a8e7-290d0f5252a4" path="/var/lib/kubelet/pods/af1accab-19c1-4b1c-a8e7-290d0f5252a4/volumes" Apr 19 15:32:49.425978 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:49.425943 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:32:49.473062 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:49.473032 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h"] Apr 19 15:32:49.473365 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:49.473337 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" podUID="53370a72-a5e8-48a1-98b3-b76f23e41002" containerName="manager" containerID="cri-o://34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b" gracePeriod=10 Apr 19 15:32:49.748545 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:49.748518 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" Apr 19 15:32:49.820558 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:49.820524 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgxfb\" (UniqueName: \"kubernetes.io/projected/53370a72-a5e8-48a1-98b3-b76f23e41002-kube-api-access-pgxfb\") pod \"53370a72-a5e8-48a1-98b3-b76f23e41002\" (UID: \"53370a72-a5e8-48a1-98b3-b76f23e41002\") " Apr 19 15:32:49.820715 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:49.820631 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/53370a72-a5e8-48a1-98b3-b76f23e41002-extensions-socket-volume\") pod \"53370a72-a5e8-48a1-98b3-b76f23e41002\" (UID: \"53370a72-a5e8-48a1-98b3-b76f23e41002\") " Apr 19 15:32:49.821133 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:49.821104 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53370a72-a5e8-48a1-98b3-b76f23e41002-extensions-socket-volume" (OuterVolumeSpecName: 
"extensions-socket-volume") pod "53370a72-a5e8-48a1-98b3-b76f23e41002" (UID: "53370a72-a5e8-48a1-98b3-b76f23e41002"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:32:49.822860 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:49.822837 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53370a72-a5e8-48a1-98b3-b76f23e41002-kube-api-access-pgxfb" (OuterVolumeSpecName: "kube-api-access-pgxfb") pod "53370a72-a5e8-48a1-98b3-b76f23e41002" (UID: "53370a72-a5e8-48a1-98b3-b76f23e41002"). InnerVolumeSpecName "kube-api-access-pgxfb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:32:49.922258 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:49.922213 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgxfb\" (UniqueName: \"kubernetes.io/projected/53370a72-a5e8-48a1-98b3-b76f23e41002-kube-api-access-pgxfb\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:49.922258 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:49.922253 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/53370a72-a5e8-48a1-98b3-b76f23e41002-extensions-socket-volume\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:32:50.470711 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:50.470672 2579 generic.go:358] "Generic (PLEG): container finished" podID="53370a72-a5e8-48a1-98b3-b76f23e41002" containerID="34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b" exitCode=0 Apr 19 15:32:50.471180 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:50.470745 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" event={"ID":"53370a72-a5e8-48a1-98b3-b76f23e41002","Type":"ContainerDied","Data":"34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b"} 
Apr 19 15:32:50.471180 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:50.470773 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h"
Apr 19 15:32:50.471180 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:50.470790 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h" event={"ID":"53370a72-a5e8-48a1-98b3-b76f23e41002","Type":"ContainerDied","Data":"c186d2437e2a90f4d81d8f93b47bd81951c7012fd53ae0362c8d2d8526386a5d"}
Apr 19 15:32:50.471180 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:50.470810 2579 scope.go:117] "RemoveContainer" containerID="34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b"
Apr 19 15:32:50.488142 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:50.487459 2579 scope.go:117] "RemoveContainer" containerID="34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b"
Apr 19 15:32:50.488142 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:32:50.487890 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b\": container with ID starting with 34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b not found: ID does not exist" containerID="34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b"
Apr 19 15:32:50.488142 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:50.487930 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b"} err="failed to get container status \"34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b\": rpc error: code = NotFound desc = could not find container \"34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b\": container with ID starting with 34fee25f4371503d0a6c4a73653fdc2281032b2467bfd9907fc8f28731a6663b not found: ID does not exist"
Apr 19 15:32:50.499633 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:50.499606 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h"]
Apr 19 15:32:50.503495 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:50.503469 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qgw6h"]
Apr 19 15:32:51.546852 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:51.546818 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53370a72-a5e8-48a1-98b3-b76f23e41002" path="/var/lib/kubelet/pods/53370a72-a5e8-48a1-98b3-b76f23e41002/volumes"
Apr 19 15:32:52.646560 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.646480 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"]
Apr 19 15:32:52.647160 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.646882 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55c79b63-7131-4c1d-ab01-420a97395ed9" containerName="manager"
Apr 19 15:32:52.647160 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.646900 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c79b63-7131-4c1d-ab01-420a97395ed9" containerName="manager"
Apr 19 15:32:52.647160 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.646923 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af1accab-19c1-4b1c-a8e7-290d0f5252a4" containerName="console"
Apr 19 15:32:52.647160 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.646932 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1accab-19c1-4b1c-a8e7-290d0f5252a4" containerName="console"
Apr 19 15:32:52.647160 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.646950 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53370a72-a5e8-48a1-98b3-b76f23e41002" containerName="manager"
Apr 19 15:32:52.647160 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.646960 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="53370a72-a5e8-48a1-98b3-b76f23e41002" containerName="manager"
Apr 19 15:32:52.647160 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.647053 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="af1accab-19c1-4b1c-a8e7-290d0f5252a4" containerName="console"
Apr 19 15:32:52.647160 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.647069 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="55c79b63-7131-4c1d-ab01-420a97395ed9" containerName="manager"
Apr 19 15:32:52.647160 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.647079 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="53370a72-a5e8-48a1-98b3-b76f23e41002" containerName="manager"
Apr 19 15:32:52.920135 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.920036 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"]
Apr 19 15:32:52.920321 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.920195 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:52.922969 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:52.922947 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-2t828\""
Apr 19 15:32:53.052487 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.052455 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.052682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.052498 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.052682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.052524 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bc0715c9-b497-4e43-9ffb-a87664024408-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.052682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.052545 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.052682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.052572 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.052682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.052590 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.052916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.052676 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bc0715c9-b497-4e43-9ffb-a87664024408-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.052916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.052755 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bc0715c9-b497-4e43-9ffb-a87664024408-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.052916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.052798 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n785t\" (UniqueName: \"kubernetes.io/projected/bc0715c9-b497-4e43-9ffb-a87664024408-kube-api-access-n785t\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154235 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154193 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154446 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154253 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154446 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154278 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bc0715c9-b497-4e43-9ffb-a87664024408-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154446 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154446 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154345 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154446 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154446 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bc0715c9-b497-4e43-9ffb-a87664024408-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154446 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bc0715c9-b497-4e43-9ffb-a87664024408-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154829 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154500 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n785t\" (UniqueName: \"kubernetes.io/projected/bc0715c9-b497-4e43-9ffb-a87664024408-kube-api-access-n785t\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154829 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154658 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154829 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.154829 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.154794 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.155103 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.155076 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.155296 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.155275 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bc0715c9-b497-4e43-9ffb-a87664024408-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.156938 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.156921 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bc0715c9-b497-4e43-9ffb-a87664024408-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.157033 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.157016 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bc0715c9-b497-4e43-9ffb-a87664024408-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.162730 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.162692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bc0715c9-b497-4e43-9ffb-a87664024408-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.163134 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.163110 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n785t\" (UniqueName: \"kubernetes.io/projected/bc0715c9-b497-4e43-9ffb-a87664024408-kube-api-access-n785t\") pod \"maas-default-gateway-openshift-default-58b6f876-k8m5w\" (UID: \"bc0715c9-b497-4e43-9ffb-a87664024408\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.231144 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.231104 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:53.369489 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.369463 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"]
Apr 19 15:32:53.371518 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:32:53.371462 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc0715c9_b497_4e43_9ffb_a87664024408.slice/crio-6716d22ee3d87f760e405b9ed06ad5d08280996971859d54b3dc896f5570646a WatchSource:0}: Error finding container 6716d22ee3d87f760e405b9ed06ad5d08280996971859d54b3dc896f5570646a: Status 404 returned error can't find the container with id 6716d22ee3d87f760e405b9ed06ad5d08280996971859d54b3dc896f5570646a
Apr 19 15:32:53.373759 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.373706 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 19 15:32:53.373843 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.373793 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 19 15:32:53.373843 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.373829 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 19 15:32:53.485635 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:53.485575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w" event={"ID":"bc0715c9-b497-4e43-9ffb-a87664024408","Type":"ContainerStarted","Data":"6716d22ee3d87f760e405b9ed06ad5d08280996971859d54b3dc896f5570646a"}
Apr 19 15:32:54.491374 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:54.491332 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w" event={"ID":"bc0715c9-b497-4e43-9ffb-a87664024408","Type":"ContainerStarted","Data":"03f50902bb20db2313d19fb29374dc01803900a073aca535cdc73e472d04fa07"}
Apr 19 15:32:54.510375 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:54.510326 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w" podStartSLOduration=2.510312609 podStartE2EDuration="2.510312609s" podCreationTimestamp="2026-04-19 15:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:32:54.507102216 +0000 UTC m=+473.613017604" watchObservedRunningTime="2026-04-19 15:32:54.510312609 +0000 UTC m=+473.616227997"
Apr 19 15:32:55.231790 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:55.231747 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:55.236602 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:55.236573 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:55.496206 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:55.496126 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:32:55.497225 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:32:55.497202 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k8m5w"
Apr 19 15:33:09.889388 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:09.889339 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gstpl"]
Apr 19 15:33:09.929267 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:09.929232 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gstpl"]
Apr 19 15:33:09.929494 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:09.929364 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-gstpl"
Apr 19 15:33:09.931732 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:09.931697 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-q77m2\""
Apr 19 15:33:09.999138 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:09.999095 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4kgm\" (UniqueName: \"kubernetes.io/projected/d30aec08-e6e1-44f6-bb01-79e0a21708bd-kube-api-access-j4kgm\") pod \"authorino-f99f4b5cd-gstpl\" (UID: \"d30aec08-e6e1-44f6-bb01-79e0a21708bd\") " pod="kuadrant-system/authorino-f99f4b5cd-gstpl"
Apr 19 15:33:10.099587 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:10.099556 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4kgm\" (UniqueName: \"kubernetes.io/projected/d30aec08-e6e1-44f6-bb01-79e0a21708bd-kube-api-access-j4kgm\") pod \"authorino-f99f4b5cd-gstpl\" (UID: \"d30aec08-e6e1-44f6-bb01-79e0a21708bd\") " pod="kuadrant-system/authorino-f99f4b5cd-gstpl"
Apr 19 15:33:10.112815 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:10.112785 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4kgm\" (UniqueName: \"kubernetes.io/projected/d30aec08-e6e1-44f6-bb01-79e0a21708bd-kube-api-access-j4kgm\") pod \"authorino-f99f4b5cd-gstpl\" (UID: \"d30aec08-e6e1-44f6-bb01-79e0a21708bd\") " pod="kuadrant-system/authorino-f99f4b5cd-gstpl"
Apr 19 15:33:10.238890 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:10.238851 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-gstpl"
Apr 19 15:33:10.369531 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:10.369506 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gstpl"]
Apr 19 15:33:10.371795 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:33:10.371764 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd30aec08_e6e1_44f6_bb01_79e0a21708bd.slice/crio-37e6fc8fcd7db4bac0c3d9adcd9f9e20e75c670a5e6ed5df99a295f84a619bdc WatchSource:0}: Error finding container 37e6fc8fcd7db4bac0c3d9adcd9f9e20e75c670a5e6ed5df99a295f84a619bdc: Status 404 returned error can't find the container with id 37e6fc8fcd7db4bac0c3d9adcd9f9e20e75c670a5e6ed5df99a295f84a619bdc
Apr 19 15:33:10.556221 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:10.556132 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-gstpl" event={"ID":"d30aec08-e6e1-44f6-bb01-79e0a21708bd","Type":"ContainerStarted","Data":"37e6fc8fcd7db4bac0c3d9adcd9f9e20e75c670a5e6ed5df99a295f84a619bdc"}
Apr 19 15:33:13.572402 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:13.572361 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-gstpl" event={"ID":"d30aec08-e6e1-44f6-bb01-79e0a21708bd","Type":"ContainerStarted","Data":"26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926"}
Apr 19 15:33:13.587774 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:13.587701 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-gstpl" podStartSLOduration=2.076412698 podStartE2EDuration="4.58768527s" podCreationTimestamp="2026-04-19 15:33:09 +0000 UTC" firstStartedPulling="2026-04-19 15:33:10.373432673 +0000 UTC m=+489.479348039" lastFinishedPulling="2026-04-19 15:33:12.884705239 +0000 UTC m=+491.990620611" observedRunningTime="2026-04-19 15:33:13.585640419 +0000 UTC m=+492.691555807" watchObservedRunningTime="2026-04-19 15:33:13.58768527 +0000 UTC m=+492.693600658"
Apr 19 15:33:14.708790 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:14.708756 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gstpl"]
Apr 19 15:33:15.580633 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:15.580588 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-gstpl" podUID="d30aec08-e6e1-44f6-bb01-79e0a21708bd" containerName="authorino" containerID="cri-o://26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926" gracePeriod=30
Apr 19 15:33:15.825142 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:15.825118 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-gstpl"
Apr 19 15:33:15.851494 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:15.851410 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4kgm\" (UniqueName: \"kubernetes.io/projected/d30aec08-e6e1-44f6-bb01-79e0a21708bd-kube-api-access-j4kgm\") pod \"d30aec08-e6e1-44f6-bb01-79e0a21708bd\" (UID: \"d30aec08-e6e1-44f6-bb01-79e0a21708bd\") "
Apr 19 15:33:15.853616 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:15.853577 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30aec08-e6e1-44f6-bb01-79e0a21708bd-kube-api-access-j4kgm" (OuterVolumeSpecName: "kube-api-access-j4kgm") pod "d30aec08-e6e1-44f6-bb01-79e0a21708bd" (UID: "d30aec08-e6e1-44f6-bb01-79e0a21708bd"). InnerVolumeSpecName "kube-api-access-j4kgm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 15:33:15.952988 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:15.952957 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4kgm\" (UniqueName: \"kubernetes.io/projected/d30aec08-e6e1-44f6-bb01-79e0a21708bd-kube-api-access-j4kgm\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\""
Apr 19 15:33:16.584856 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:16.584814 2579 generic.go:358] "Generic (PLEG): container finished" podID="d30aec08-e6e1-44f6-bb01-79e0a21708bd" containerID="26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926" exitCode=0
Apr 19 15:33:16.585056 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:16.584867 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-gstpl"
Apr 19 15:33:16.585056 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:16.584908 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-gstpl" event={"ID":"d30aec08-e6e1-44f6-bb01-79e0a21708bd","Type":"ContainerDied","Data":"26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926"}
Apr 19 15:33:16.585056 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:16.584947 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-gstpl" event={"ID":"d30aec08-e6e1-44f6-bb01-79e0a21708bd","Type":"ContainerDied","Data":"37e6fc8fcd7db4bac0c3d9adcd9f9e20e75c670a5e6ed5df99a295f84a619bdc"}
Apr 19 15:33:16.585056 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:16.584964 2579 scope.go:117] "RemoveContainer" containerID="26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926"
Apr 19 15:33:16.594345 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:16.594324 2579 scope.go:117] "RemoveContainer" containerID="26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926"
Apr 19 15:33:16.594599 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:33:16.594576 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926\": container with ID starting with 26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926 not found: ID does not exist" containerID="26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926"
Apr 19 15:33:16.594644 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:16.594610 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926"} err="failed to get container status \"26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926\": rpc error: code = NotFound desc = could not find container \"26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926\": container with ID starting with 26c3983c005522703731180bd5bcb335e5cc2780434f620866af6c0b770b2926 not found: ID does not exist"
Apr 19 15:33:16.605839 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:16.605812 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gstpl"]
Apr 19 15:33:16.609766 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:16.609739 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gstpl"]
Apr 19 15:33:17.551552 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:17.547315 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30aec08-e6e1-44f6-bb01-79e0a21708bd" path="/var/lib/kubelet/pods/d30aec08-e6e1-44f6-bb01-79e0a21708bd/volumes"
Apr 19 15:33:43.491661 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.491624 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-6fn6s"]
Apr 19 15:33:43.492241 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.492219 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d30aec08-e6e1-44f6-bb01-79e0a21708bd" containerName="authorino"
Apr 19 15:33:43.492315 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.492245 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30aec08-e6e1-44f6-bb01-79e0a21708bd" containerName="authorino"
Apr 19 15:33:43.492390 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.492377 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d30aec08-e6e1-44f6-bb01-79e0a21708bd" containerName="authorino"
Apr 19 15:33:43.496839 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.496814 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s"
Apr 19 15:33:43.499293 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.499268 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-ncp5d\""
Apr 19 15:33:43.504296 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.504268 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-6fn6s"]
Apr 19 15:33:43.600966 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.600913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbxr2\" (UniqueName: \"kubernetes.io/projected/e2999878-8536-4102-a769-9be8c29eddf2-kube-api-access-kbxr2\") pod \"maas-controller-6d4c8f55f9-6fn6s\" (UID: \"e2999878-8536-4102-a769-9be8c29eddf2\") " pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s"
Apr 19 15:33:43.640499 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.640458 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-67747567d9-sz5nv"]
Apr 19 15:33:43.643960 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.643943 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67747567d9-sz5nv"
Apr 19 15:33:43.652876 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.652848 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-67747567d9-sz5nv"]
Apr 19 15:33:43.702447 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.702414 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbxr2\" (UniqueName: \"kubernetes.io/projected/e2999878-8536-4102-a769-9be8c29eddf2-kube-api-access-kbxr2\") pod \"maas-controller-6d4c8f55f9-6fn6s\" (UID: \"e2999878-8536-4102-a769-9be8c29eddf2\") " pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s"
Apr 19 15:33:43.711023 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.710991 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbxr2\" (UniqueName: \"kubernetes.io/projected/e2999878-8536-4102-a769-9be8c29eddf2-kube-api-access-kbxr2\") pod \"maas-controller-6d4c8f55f9-6fn6s\" (UID: \"e2999878-8536-4102-a769-9be8c29eddf2\") " pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s"
Apr 19 15:33:43.757466 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.757370 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-67747567d9-sz5nv"]
Apr 19 15:33:43.757685 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:33:43.757663 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-g7974], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-67747567d9-sz5nv" podUID="bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda"
Apr 19 15:33:43.781046 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.781003 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6b8fdc774b-s5gfg"]
Apr 19 15:33:43.784596 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.784573 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg"
Apr 19 15:33:43.791919 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.791891 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6b8fdc774b-s5gfg"]
Apr 19 15:33:43.803343 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.803305 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7974\" (UniqueName: \"kubernetes.io/projected/bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda-kube-api-access-g7974\") pod \"maas-controller-67747567d9-sz5nv\" (UID: \"bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda\") " pod="opendatahub/maas-controller-67747567d9-sz5nv"
Apr 19 15:33:43.808849 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.808827 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s"
Apr 19 15:33:43.904595 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.904559 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4tdk\" (UniqueName: \"kubernetes.io/projected/7087ff84-8933-47d7-a5d7-15a39b2239fd-kube-api-access-h4tdk\") pod \"maas-controller-6b8fdc774b-s5gfg\" (UID: \"7087ff84-8933-47d7-a5d7-15a39b2239fd\") " pod="opendatahub/maas-controller-6b8fdc774b-s5gfg"
Apr 19 15:33:43.904760 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.904639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7974\" (UniqueName: \"kubernetes.io/projected/bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda-kube-api-access-g7974\") pod \"maas-controller-67747567d9-sz5nv\" (UID: \"bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda\") " pod="opendatahub/maas-controller-67747567d9-sz5nv"
Apr 19 15:33:43.912848 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:43.912819 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7974\" (UniqueName: \"kubernetes.io/projected/bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda-kube-api-access-g7974\") pod \"maas-controller-67747567d9-sz5nv\" (UID: \"bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda\") " pod="opendatahub/maas-controller-67747567d9-sz5nv"
Apr 19 15:33:44.006152 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:44.006096 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4tdk\" (UniqueName: \"kubernetes.io/projected/7087ff84-8933-47d7-a5d7-15a39b2239fd-kube-api-access-h4tdk\") pod \"maas-controller-6b8fdc774b-s5gfg\" (UID: \"7087ff84-8933-47d7-a5d7-15a39b2239fd\") " pod="opendatahub/maas-controller-6b8fdc774b-s5gfg"
Apr 19 15:33:44.013955 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:44.013883 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4tdk\" (UniqueName: \"kubernetes.io/projected/7087ff84-8933-47d7-a5d7-15a39b2239fd-kube-api-access-h4tdk\") pod \"maas-controller-6b8fdc774b-s5gfg\" (UID: \"7087ff84-8933-47d7-a5d7-15a39b2239fd\") " pod="opendatahub/maas-controller-6b8fdc774b-s5gfg"
Apr 19 15:33:44.096332 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:44.096297 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg" Apr 19 15:33:44.146713 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:44.146683 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-6fn6s"] Apr 19 15:33:44.224093 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:44.224065 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6b8fdc774b-s5gfg"] Apr 19 15:33:44.225528 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:33:44.225501 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7087ff84_8933_47d7_a5d7_15a39b2239fd.slice/crio-d0bebea145934a2ee9fe4cdfad42be824671b3b815f794af4f2f48d5f1ddcbc9 WatchSource:0}: Error finding container d0bebea145934a2ee9fe4cdfad42be824671b3b815f794af4f2f48d5f1ddcbc9: Status 404 returned error can't find the container with id d0bebea145934a2ee9fe4cdfad42be824671b3b815f794af4f2f48d5f1ddcbc9 Apr 19 15:33:44.707272 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:44.707226 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s" event={"ID":"e2999878-8536-4102-a769-9be8c29eddf2","Type":"ContainerStarted","Data":"736884e3e91a658f3ec4583a6f93a46a11cf82565f61a8a0b540185845744cd1"} Apr 19 15:33:44.709329 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:44.709297 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg" event={"ID":"7087ff84-8933-47d7-a5d7-15a39b2239fd","Type":"ContainerStarted","Data":"d0bebea145934a2ee9fe4cdfad42be824671b3b815f794af4f2f48d5f1ddcbc9"} Apr 19 15:33:44.709470 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:44.709366 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-67747567d9-sz5nv" Apr 19 15:33:44.734070 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:44.734040 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67747567d9-sz5nv" Apr 19 15:33:44.916586 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:44.916553 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7974\" (UniqueName: \"kubernetes.io/projected/bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda-kube-api-access-g7974\") pod \"bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda\" (UID: \"bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda\") " Apr 19 15:33:44.921545 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:44.921498 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda-kube-api-access-g7974" (OuterVolumeSpecName: "kube-api-access-g7974") pod "bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda" (UID: "bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda"). InnerVolumeSpecName "kube-api-access-g7974". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:33:45.018560 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:45.018471 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g7974\" (UniqueName: \"kubernetes.io/projected/bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda-kube-api-access-g7974\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:33:45.713525 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:45.713493 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-67747567d9-sz5nv" Apr 19 15:33:45.741025 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:45.740997 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-67747567d9-sz5nv"] Apr 19 15:33:45.745101 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:45.745070 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-67747567d9-sz5nv"] Apr 19 15:33:47.547067 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:47.546971 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda" path="/var/lib/kubelet/pods/bf84d3b7-beb6-4c3e-aae7-670dc0aa1bda/volumes" Apr 19 15:33:47.729988 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:47.729951 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s" event={"ID":"e2999878-8536-4102-a769-9be8c29eddf2","Type":"ContainerStarted","Data":"ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0"} Apr 19 15:33:47.730188 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:47.730085 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s" Apr 19 15:33:47.731335 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:47.731309 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg" event={"ID":"7087ff84-8933-47d7-a5d7-15a39b2239fd","Type":"ContainerStarted","Data":"a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59"} Apr 19 15:33:47.731488 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:47.731474 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg" Apr 19 15:33:47.746088 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:47.746026 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s" podStartSLOduration=1.6462501170000001 podStartE2EDuration="4.746008464s" podCreationTimestamp="2026-04-19 15:33:43 +0000 UTC" firstStartedPulling="2026-04-19 15:33:44.149910729 +0000 UTC m=+523.255826095" lastFinishedPulling="2026-04-19 15:33:47.249669076 +0000 UTC m=+526.355584442" observedRunningTime="2026-04-19 15:33:47.745791412 +0000 UTC m=+526.851706800" watchObservedRunningTime="2026-04-19 15:33:47.746008464 +0000 UTC m=+526.851923853" Apr 19 15:33:47.760609 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:47.760441 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg" podStartSLOduration=1.735675145 podStartE2EDuration="4.760424376s" podCreationTimestamp="2026-04-19 15:33:43 +0000 UTC" firstStartedPulling="2026-04-19 15:33:44.226919143 +0000 UTC m=+523.332834510" lastFinishedPulling="2026-04-19 15:33:47.251668374 +0000 UTC m=+526.357583741" observedRunningTime="2026-04-19 15:33:47.759789351 +0000 UTC m=+526.865704750" watchObservedRunningTime="2026-04-19 15:33:47.760424376 +0000 UTC m=+526.866339767" Apr 19 15:33:49.650780 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:49.650689 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6759f7f9d8-fdtn5"] Apr 19 15:33:49.654165 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:49.654149 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6759f7f9d8-fdtn5" Apr 19 15:33:49.656379 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:49.656347 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 19 15:33:49.656499 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:49.656350 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 19 15:33:49.656499 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:49.656358 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-rcvbl\"" Apr 19 15:33:49.660019 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:49.659870 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtkf2\" (UniqueName: \"kubernetes.io/projected/0b26d368-5fbd-47aa-beaa-7c149e024d36-kube-api-access-qtkf2\") pod \"maas-api-6759f7f9d8-fdtn5\" (UID: \"0b26d368-5fbd-47aa-beaa-7c149e024d36\") " pod="opendatahub/maas-api-6759f7f9d8-fdtn5" Apr 19 15:33:49.660019 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:49.659978 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0b26d368-5fbd-47aa-beaa-7c149e024d36-maas-api-tls\") pod \"maas-api-6759f7f9d8-fdtn5\" (UID: \"0b26d368-5fbd-47aa-beaa-7c149e024d36\") " pod="opendatahub/maas-api-6759f7f9d8-fdtn5" Apr 19 15:33:49.663828 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:49.663809 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6759f7f9d8-fdtn5"] Apr 19 15:33:49.761155 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:49.761107 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0b26d368-5fbd-47aa-beaa-7c149e024d36-maas-api-tls\") pod 
\"maas-api-6759f7f9d8-fdtn5\" (UID: \"0b26d368-5fbd-47aa-beaa-7c149e024d36\") " pod="opendatahub/maas-api-6759f7f9d8-fdtn5" Apr 19 15:33:49.761368 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:49.761224 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtkf2\" (UniqueName: \"kubernetes.io/projected/0b26d368-5fbd-47aa-beaa-7c149e024d36-kube-api-access-qtkf2\") pod \"maas-api-6759f7f9d8-fdtn5\" (UID: \"0b26d368-5fbd-47aa-beaa-7c149e024d36\") " pod="opendatahub/maas-api-6759f7f9d8-fdtn5" Apr 19 15:33:49.761368 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:33:49.761234 2579 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 19 15:33:49.761368 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:33:49.761292 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b26d368-5fbd-47aa-beaa-7c149e024d36-maas-api-tls podName:0b26d368-5fbd-47aa-beaa-7c149e024d36 nodeName:}" failed. No retries permitted until 2026-04-19 15:33:50.261275714 +0000 UTC m=+529.367191080 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/0b26d368-5fbd-47aa-beaa-7c149e024d36-maas-api-tls") pod "maas-api-6759f7f9d8-fdtn5" (UID: "0b26d368-5fbd-47aa-beaa-7c149e024d36") : secret "maas-api-serving-cert" not found Apr 19 15:33:49.772367 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:49.772330 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtkf2\" (UniqueName: \"kubernetes.io/projected/0b26d368-5fbd-47aa-beaa-7c149e024d36-kube-api-access-qtkf2\") pod \"maas-api-6759f7f9d8-fdtn5\" (UID: \"0b26d368-5fbd-47aa-beaa-7c149e024d36\") " pod="opendatahub/maas-api-6759f7f9d8-fdtn5" Apr 19 15:33:50.264837 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:50.264800 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0b26d368-5fbd-47aa-beaa-7c149e024d36-maas-api-tls\") pod \"maas-api-6759f7f9d8-fdtn5\" (UID: \"0b26d368-5fbd-47aa-beaa-7c149e024d36\") " pod="opendatahub/maas-api-6759f7f9d8-fdtn5" Apr 19 15:33:50.267485 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:50.267460 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0b26d368-5fbd-47aa-beaa-7c149e024d36-maas-api-tls\") pod \"maas-api-6759f7f9d8-fdtn5\" (UID: \"0b26d368-5fbd-47aa-beaa-7c149e024d36\") " pod="opendatahub/maas-api-6759f7f9d8-fdtn5" Apr 19 15:33:50.566012 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:50.565910 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6759f7f9d8-fdtn5" Apr 19 15:33:50.733184 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:50.733158 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6759f7f9d8-fdtn5"] Apr 19 15:33:50.766314 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:50.766264 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6759f7f9d8-fdtn5" event={"ID":"0b26d368-5fbd-47aa-beaa-7c149e024d36","Type":"ContainerStarted","Data":"f144b9839f5999563b49567c4aba208109581ee1f8c124fb7612b3559424ea2a"} Apr 19 15:33:52.777539 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:52.777501 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6759f7f9d8-fdtn5" event={"ID":"0b26d368-5fbd-47aa-beaa-7c149e024d36","Type":"ContainerStarted","Data":"6a9a25920789e754d7308daca58aaa5acbee738fb6c4a962331f6dcd48306d52"} Apr 19 15:33:52.778008 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:52.777618 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6759f7f9d8-fdtn5" Apr 19 15:33:52.795761 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:52.795660 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6759f7f9d8-fdtn5" podStartSLOduration=2.179015986 podStartE2EDuration="3.795633284s" podCreationTimestamp="2026-04-19 15:33:49 +0000 UTC" firstStartedPulling="2026-04-19 15:33:50.755048442 +0000 UTC m=+529.860963822" lastFinishedPulling="2026-04-19 15:33:52.371665754 +0000 UTC m=+531.477581120" observedRunningTime="2026-04-19 15:33:52.793367262 +0000 UTC m=+531.899282653" watchObservedRunningTime="2026-04-19 15:33:52.795633284 +0000 UTC m=+531.901548690" Apr 19 15:33:58.740903 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:58.740868 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg" Apr 19 15:33:58.741282 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:58.741087 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s" Apr 19 15:33:58.787643 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:58.787617 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6759f7f9d8-fdtn5" Apr 19 15:33:58.797172 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:58.797144 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-6fn6s"] Apr 19 15:33:58.803386 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:58.803348 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s" podUID="e2999878-8536-4102-a769-9be8c29eddf2" containerName="manager" containerID="cri-o://ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0" gracePeriod=10 Apr 19 15:33:59.041569 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.041544 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s" Apr 19 15:33:59.096011 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.095973 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-78fd9b446f-dxglg"] Apr 19 15:33:59.096494 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.096475 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2999878-8536-4102-a769-9be8c29eddf2" containerName="manager" Apr 19 15:33:59.096593 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.096495 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2999878-8536-4102-a769-9be8c29eddf2" containerName="manager" Apr 19 15:33:59.096651 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.096642 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2999878-8536-4102-a769-9be8c29eddf2" containerName="manager" Apr 19 15:33:59.100319 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.100296 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-78fd9b446f-dxglg" Apr 19 15:33:59.108260 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.108076 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-78fd9b446f-dxglg"] Apr 19 15:33:59.147450 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.147422 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbxr2\" (UniqueName: \"kubernetes.io/projected/e2999878-8536-4102-a769-9be8c29eddf2-kube-api-access-kbxr2\") pod \"e2999878-8536-4102-a769-9be8c29eddf2\" (UID: \"e2999878-8536-4102-a769-9be8c29eddf2\") " Apr 19 15:33:59.147619 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.147552 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4797p\" (UniqueName: \"kubernetes.io/projected/30e3f4ed-12b9-435e-85e2-603bfcedc0d5-kube-api-access-4797p\") pod \"maas-controller-78fd9b446f-dxglg\" (UID: \"30e3f4ed-12b9-435e-85e2-603bfcedc0d5\") " pod="opendatahub/maas-controller-78fd9b446f-dxglg" Apr 19 15:33:59.149815 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.149711 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2999878-8536-4102-a769-9be8c29eddf2-kube-api-access-kbxr2" (OuterVolumeSpecName: "kube-api-access-kbxr2") pod "e2999878-8536-4102-a769-9be8c29eddf2" (UID: "e2999878-8536-4102-a769-9be8c29eddf2"). InnerVolumeSpecName "kube-api-access-kbxr2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:33:59.248623 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.248586 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4797p\" (UniqueName: \"kubernetes.io/projected/30e3f4ed-12b9-435e-85e2-603bfcedc0d5-kube-api-access-4797p\") pod \"maas-controller-78fd9b446f-dxglg\" (UID: \"30e3f4ed-12b9-435e-85e2-603bfcedc0d5\") " pod="opendatahub/maas-controller-78fd9b446f-dxglg" Apr 19 15:33:59.248817 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.248652 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kbxr2\" (UniqueName: \"kubernetes.io/projected/e2999878-8536-4102-a769-9be8c29eddf2-kube-api-access-kbxr2\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:33:59.256531 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.256507 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4797p\" (UniqueName: \"kubernetes.io/projected/30e3f4ed-12b9-435e-85e2-603bfcedc0d5-kube-api-access-4797p\") pod \"maas-controller-78fd9b446f-dxglg\" (UID: \"30e3f4ed-12b9-435e-85e2-603bfcedc0d5\") " pod="opendatahub/maas-controller-78fd9b446f-dxglg" Apr 19 15:33:59.412836 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.412697 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-78fd9b446f-dxglg" Apr 19 15:33:59.540569 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.540541 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-78fd9b446f-dxglg"] Apr 19 15:33:59.542644 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:33:59.542616 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30e3f4ed_12b9_435e_85e2_603bfcedc0d5.slice/crio-12048d8eb9d30131f525105d754fea35d79f4e3451594d5237526ea525c0c0a1 WatchSource:0}: Error finding container 12048d8eb9d30131f525105d754fea35d79f4e3451594d5237526ea525c0c0a1: Status 404 returned error can't find the container with id 12048d8eb9d30131f525105d754fea35d79f4e3451594d5237526ea525c0c0a1 Apr 19 15:33:59.808863 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.808829 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-78fd9b446f-dxglg" event={"ID":"30e3f4ed-12b9-435e-85e2-603bfcedc0d5","Type":"ContainerStarted","Data":"12048d8eb9d30131f525105d754fea35d79f4e3451594d5237526ea525c0c0a1"} Apr 19 15:33:59.810079 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.810048 2579 generic.go:358] "Generic (PLEG): container finished" podID="e2999878-8536-4102-a769-9be8c29eddf2" containerID="ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0" exitCode=0 Apr 19 15:33:59.810195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.810106 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s" event={"ID":"e2999878-8536-4102-a769-9be8c29eddf2","Type":"ContainerDied","Data":"ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0"} Apr 19 15:33:59.810195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.810133 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s" 
event={"ID":"e2999878-8536-4102-a769-9be8c29eddf2","Type":"ContainerDied","Data":"736884e3e91a658f3ec4583a6f93a46a11cf82565f61a8a0b540185845744cd1"} Apr 19 15:33:59.810195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.810148 2579 scope.go:117] "RemoveContainer" containerID="ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0" Apr 19 15:33:59.810195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.810109 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-6fn6s" Apr 19 15:33:59.819272 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.819245 2579 scope.go:117] "RemoveContainer" containerID="ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0" Apr 19 15:33:59.819529 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:33:59.819506 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0\": container with ID starting with ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0 not found: ID does not exist" containerID="ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0" Apr 19 15:33:59.819576 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.819537 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0"} err="failed to get container status \"ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0\": rpc error: code = NotFound desc = could not find container \"ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0\": container with ID starting with ac26b19d06e52be3104b8c274da61677ff7f4fc286c3d82cfa06358dbf3d48c0 not found: ID does not exist" Apr 19 15:33:59.827118 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.827094 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["opendatahub/maas-controller-6d4c8f55f9-6fn6s"] Apr 19 15:33:59.829521 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:33:59.829500 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-6fn6s"] Apr 19 15:34:00.815595 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:00.815559 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-78fd9b446f-dxglg" event={"ID":"30e3f4ed-12b9-435e-85e2-603bfcedc0d5","Type":"ContainerStarted","Data":"ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e"} Apr 19 15:34:00.816052 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:00.815631 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-78fd9b446f-dxglg" Apr 19 15:34:00.832161 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:00.832114 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-78fd9b446f-dxglg" podStartSLOduration=1.483461355 podStartE2EDuration="1.832099057s" podCreationTimestamp="2026-04-19 15:33:59 +0000 UTC" firstStartedPulling="2026-04-19 15:33:59.543893332 +0000 UTC m=+538.649808698" lastFinishedPulling="2026-04-19 15:33:59.892531031 +0000 UTC m=+538.998446400" observedRunningTime="2026-04-19 15:34:00.829393916 +0000 UTC m=+539.935309306" watchObservedRunningTime="2026-04-19 15:34:00.832099057 +0000 UTC m=+539.938014444" Apr 19 15:34:01.545991 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:01.545961 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2999878-8536-4102-a769-9be8c29eddf2" path="/var/lib/kubelet/pods/e2999878-8536-4102-a769-9be8c29eddf2/volumes" Apr 19 15:34:11.826703 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:11.826662 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-78fd9b446f-dxglg" Apr 19 15:34:11.864291 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:11.864258 
2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6b8fdc774b-s5gfg"] Apr 19 15:34:11.864556 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:11.864530 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg" podUID="7087ff84-8933-47d7-a5d7-15a39b2239fd" containerName="manager" containerID="cri-o://a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59" gracePeriod=10 Apr 19 15:34:12.117037 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.117014 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg" Apr 19 15:34:12.172365 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.172332 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4tdk\" (UniqueName: \"kubernetes.io/projected/7087ff84-8933-47d7-a5d7-15a39b2239fd-kube-api-access-h4tdk\") pod \"7087ff84-8933-47d7-a5d7-15a39b2239fd\" (UID: \"7087ff84-8933-47d7-a5d7-15a39b2239fd\") " Apr 19 15:34:12.174468 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.174445 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7087ff84-8933-47d7-a5d7-15a39b2239fd-kube-api-access-h4tdk" (OuterVolumeSpecName: "kube-api-access-h4tdk") pod "7087ff84-8933-47d7-a5d7-15a39b2239fd" (UID: "7087ff84-8933-47d7-a5d7-15a39b2239fd"). InnerVolumeSpecName "kube-api-access-h4tdk". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 15:34:12.273765 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.273699 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h4tdk\" (UniqueName: \"kubernetes.io/projected/7087ff84-8933-47d7-a5d7-15a39b2239fd-kube-api-access-h4tdk\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\""
Apr 19 15:34:12.862195 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.862156 2579 generic.go:358] "Generic (PLEG): container finished" podID="7087ff84-8933-47d7-a5d7-15a39b2239fd" containerID="a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59" exitCode=0
Apr 19 15:34:12.862659 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.862226 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg"
Apr 19 15:34:12.862659 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.862236 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg" event={"ID":"7087ff84-8933-47d7-a5d7-15a39b2239fd","Type":"ContainerDied","Data":"a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59"}
Apr 19 15:34:12.862659 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.862272 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b8fdc774b-s5gfg" event={"ID":"7087ff84-8933-47d7-a5d7-15a39b2239fd","Type":"ContainerDied","Data":"d0bebea145934a2ee9fe4cdfad42be824671b3b815f794af4f2f48d5f1ddcbc9"}
Apr 19 15:34:12.862659 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.862287 2579 scope.go:117] "RemoveContainer" containerID="a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59"
Apr 19 15:34:12.872271 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.872066 2579 scope.go:117] "RemoveContainer" containerID="a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59"
Apr 19 15:34:12.872383 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:34:12.872361 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59\": container with ID starting with a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59 not found: ID does not exist" containerID="a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59"
Apr 19 15:34:12.872432 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.872393 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59"} err="failed to get container status \"a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59\": rpc error: code = NotFound desc = could not find container \"a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59\": container with ID starting with a92a118a788d699bdbde00649864bbc0aab73b5051576c07ecf023e695c18a59 not found: ID does not exist"
Apr 19 15:34:12.884669 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.884643 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6b8fdc774b-s5gfg"]
Apr 19 15:34:12.888246 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:12.888225 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6b8fdc774b-s5gfg"]
Apr 19 15:34:13.549741 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:13.549111 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7087ff84-8933-47d7-a5d7-15a39b2239fd" path="/var/lib/kubelet/pods/7087ff84-8933-47d7-a5d7-15a39b2239fd/volumes"
Apr 19 15:34:18.517879 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.517837 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"]
Apr 19 15:34:18.518261 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.518255 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7087ff84-8933-47d7-a5d7-15a39b2239fd" containerName="manager"
Apr 19 15:34:18.518303 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.518266 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="7087ff84-8933-47d7-a5d7-15a39b2239fd" containerName="manager"
Apr 19 15:34:18.518365 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.518353 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="7087ff84-8933-47d7-a5d7-15a39b2239fd" containerName="manager"
Apr 19 15:34:18.567443 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.567399 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"]
Apr 19 15:34:18.567638 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.567544 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.570042 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.570017 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-w98zh\""
Apr 19 15:34:18.570196 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.570046 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\""
Apr 19 15:34:18.570196 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.570025 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 19 15:34:18.570811 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.570788 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 19 15:34:18.733029 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.732987 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.733229 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.733043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.733229 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.733174 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.733370 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.733227 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.733370 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.733316 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.733370 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.733344 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn5ng\" (UniqueName: \"kubernetes.io/projected/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-kube-api-access-xn5ng\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.834687 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.834589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.834687 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.834659 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.834687 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.834683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.835008 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.834757 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.835008 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.834787 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn5ng\" (UniqueName: \"kubernetes.io/projected/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-kube-api-access-xn5ng\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.835008 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.834868 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.835170 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.835093 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.835170 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.835156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.835296 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.835228 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.837124 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.837104 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.837517 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.837497 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.843445 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.843421 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn5ng\" (UniqueName: \"kubernetes.io/projected/cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8-kube-api-access-xn5ng\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-947g7\" (UID: \"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:18.878645 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:18.878606 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:19.015952 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:19.015926 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"]
Apr 19 15:34:19.017770 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:34:19.017734 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc45878c_6dbb_47e9_9a41_a1bfd9c6edf8.slice/crio-d76abd7a7d1bb826c9de865433d9bd5c4073572a4d016b2c8ef93cd3340f3cc8 WatchSource:0}: Error finding container d76abd7a7d1bb826c9de865433d9bd5c4073572a4d016b2c8ef93cd3340f3cc8: Status 404 returned error can't find the container with id d76abd7a7d1bb826c9de865433d9bd5c4073572a4d016b2c8ef93cd3340f3cc8
Apr 19 15:34:19.892819 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:19.892764 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7" event={"ID":"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8","Type":"ContainerStarted","Data":"d76abd7a7d1bb826c9de865433d9bd5c4073572a4d016b2c8ef93cd3340f3cc8"}
Apr 19 15:34:25.919037 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:25.919000 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7" event={"ID":"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8","Type":"ContainerStarted","Data":"51cece94bc8155349585e7508e0cbda65f33a1136c5d828d6177a4fc9e277b75"}
Apr 19 15:34:29.720960 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:29.720922 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6759f7f9d8-fdtn5"]
Apr 19 15:34:29.721327 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:29.721165 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-6759f7f9d8-fdtn5" podUID="0b26d368-5fbd-47aa-beaa-7c149e024d36" containerName="maas-api" containerID="cri-o://6a9a25920789e754d7308daca58aaa5acbee738fb6c4a962331f6dcd48306d52" gracePeriod=30
Apr 19 15:34:29.939301 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:29.939267 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b26d368-5fbd-47aa-beaa-7c149e024d36" containerID="6a9a25920789e754d7308daca58aaa5acbee738fb6c4a962331f6dcd48306d52" exitCode=0
Apr 19 15:34:29.939494 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:29.939346 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6759f7f9d8-fdtn5" event={"ID":"0b26d368-5fbd-47aa-beaa-7c149e024d36","Type":"ContainerDied","Data":"6a9a25920789e754d7308daca58aaa5acbee738fb6c4a962331f6dcd48306d52"}
Apr 19 15:34:29.985005 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:29.984945 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6759f7f9d8-fdtn5"
Apr 19 15:34:30.050123 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:30.050081 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtkf2\" (UniqueName: \"kubernetes.io/projected/0b26d368-5fbd-47aa-beaa-7c149e024d36-kube-api-access-qtkf2\") pod \"0b26d368-5fbd-47aa-beaa-7c149e024d36\" (UID: \"0b26d368-5fbd-47aa-beaa-7c149e024d36\") "
Apr 19 15:34:30.050344 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:30.050314 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0b26d368-5fbd-47aa-beaa-7c149e024d36-maas-api-tls\") pod \"0b26d368-5fbd-47aa-beaa-7c149e024d36\" (UID: \"0b26d368-5fbd-47aa-beaa-7c149e024d36\") "
Apr 19 15:34:30.052506 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:30.052476 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b26d368-5fbd-47aa-beaa-7c149e024d36-kube-api-access-qtkf2" (OuterVolumeSpecName: "kube-api-access-qtkf2") pod "0b26d368-5fbd-47aa-beaa-7c149e024d36" (UID: "0b26d368-5fbd-47aa-beaa-7c149e024d36"). InnerVolumeSpecName "kube-api-access-qtkf2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 15:34:30.052604 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:30.052527 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b26d368-5fbd-47aa-beaa-7c149e024d36-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "0b26d368-5fbd-47aa-beaa-7c149e024d36" (UID: "0b26d368-5fbd-47aa-beaa-7c149e024d36"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 15:34:30.151215 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:30.151173 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qtkf2\" (UniqueName: \"kubernetes.io/projected/0b26d368-5fbd-47aa-beaa-7c149e024d36-kube-api-access-qtkf2\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\""
Apr 19 15:34:30.151215 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:30.151205 2579 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0b26d368-5fbd-47aa-beaa-7c149e024d36-maas-api-tls\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\""
Apr 19 15:34:30.945235 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:30.945202 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6759f7f9d8-fdtn5"
Apr 19 15:34:30.945235 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:30.945219 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6759f7f9d8-fdtn5" event={"ID":"0b26d368-5fbd-47aa-beaa-7c149e024d36","Type":"ContainerDied","Data":"f144b9839f5999563b49567c4aba208109581ee1f8c124fb7612b3559424ea2a"}
Apr 19 15:34:30.945807 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:30.945265 2579 scope.go:117] "RemoveContainer" containerID="6a9a25920789e754d7308daca58aaa5acbee738fb6c4a962331f6dcd48306d52"
Apr 19 15:34:30.967148 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:30.967101 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6759f7f9d8-fdtn5"]
Apr 19 15:34:30.970194 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:30.970169 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-6759f7f9d8-fdtn5"]
Apr 19 15:34:31.551853 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:31.551820 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b26d368-5fbd-47aa-beaa-7c149e024d36" path="/var/lib/kubelet/pods/0b26d368-5fbd-47aa-beaa-7c149e024d36/volumes"
Apr 19 15:34:31.951817 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:31.951779 2579 generic.go:358] "Generic (PLEG): container finished" podID="cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8" containerID="51cece94bc8155349585e7508e0cbda65f33a1136c5d828d6177a4fc9e277b75" exitCode=0
Apr 19 15:34:31.952222 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:31.951844 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7" event={"ID":"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8","Type":"ContainerDied","Data":"51cece94bc8155349585e7508e0cbda65f33a1136c5d828d6177a4fc9e277b75"}
Apr 19 15:34:33.962220 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:33.962184 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7" event={"ID":"cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8","Type":"ContainerStarted","Data":"ee8ee003d31c475f79a1d15b3161c1c7158bf0be1098451714c099ed98394771"}
Apr 19 15:34:33.962588 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:33.962411 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7"
Apr 19 15:34:33.978792 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:33.978736 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7" podStartSLOduration=1.9529584070000001 podStartE2EDuration="15.978699582s" podCreationTimestamp="2026-04-19 15:34:18 +0000 UTC" firstStartedPulling="2026-04-19 15:34:19.019803466 +0000 UTC m=+558.125718832" lastFinishedPulling="2026-04-19 15:34:33.045544628 +0000 UTC m=+572.151460007" observedRunningTime="2026-04-19 15:34:33.977833408 +0000 UTC m=+573.083748799" watchObservedRunningTime="2026-04-19 15:34:33.978699582 +0000 UTC m=+573.084614974"
Apr 19 15:34:35.910743 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:35.910674 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"]
Apr 19 15:34:35.911126 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:35.911085 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b26d368-5fbd-47aa-beaa-7c149e024d36" containerName="maas-api"
Apr 19 15:34:35.911126 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:35.911096 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b26d368-5fbd-47aa-beaa-7c149e024d36" containerName="maas-api"
Apr 19 15:34:35.911205 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:35.911154 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b26d368-5fbd-47aa-beaa-7c149e024d36" containerName="maas-api"
Apr 19 15:34:35.914602 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:35.914585 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:35.916917 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:35.916899 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 19 15:34:35.924596 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:35.924573 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"]
Apr 19 15:34:36.002705 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.002665 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh69w\" (UniqueName: \"kubernetes.io/projected/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-kube-api-access-fh69w\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.002892 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.002772 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.002892 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.002827 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.003011 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.002894 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.003011 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.002918 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.003111 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.003040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.103650 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.103613 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.103869 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.103668 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.103869 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.103690 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.103869 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.103805 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.103869 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.103867 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh69w\" (UniqueName: \"kubernetes.io/projected/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-kube-api-access-fh69w\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.104092 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.103908 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.104153 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.104136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.104213 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.104182 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.104282 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.104266 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.106191 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.106169 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.106398 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.106383 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.112340 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.112317 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh69w\" (UniqueName: \"kubernetes.io/projected/ac0fb6b7-ff74-4721-9cf5-f7df68c1c205-kube-api-access-fh69w\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk\" (UID: \"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.226174 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.226132 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"
Apr 19 15:34:36.362784 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.362760 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk"]
Apr 19 15:34:36.368356 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:34:36.368310 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac0fb6b7_ff74_4721_9cf5_f7df68c1c205.slice/crio-7a3863a6e38061e07b7e57881daf2b54cd5a599361ed24b934dc5b8df88f56a8 WatchSource:0}: Error finding container 7a3863a6e38061e07b7e57881daf2b54cd5a599361ed24b934dc5b8df88f56a8: Status 404 returned error can't find the container with id 7a3863a6e38061e07b7e57881daf2b54cd5a599361ed24b934dc5b8df88f56a8
Apr 19 15:34:36.976506 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.976465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk" event={"ID":"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205","Type":"ContainerStarted","Data":"0ce6ae1cd7fe7a2445a149e0cdc1b33e2ea8f9c2eced0b64a4ec4b433d5ce9c8"}
Apr 19 15:34:36.976506 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:36.976505 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk" event={"ID":"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205","Type":"ContainerStarted","Data":"7a3863a6e38061e07b7e57881daf2b54cd5a599361ed24b934dc5b8df88f56a8"}
Apr 19 15:34:39.667269 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.667235 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"]
Apr 19 15:34:39.671468 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.671441 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.673538 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.673516 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 19 15:34:39.686243 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.686215 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"]
Apr 19 15:34:39.739348 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.739306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpn47\" (UniqueName: \"kubernetes.io/projected/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-kube-api-access-kpn47\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.739512 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.739370 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.739512 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.739392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.739512 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.739417 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.739623 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.739533 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.739623 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.739565 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.840795 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.840751 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.840795 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.840791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.841105 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.840942 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpn47\" (UniqueName: \"kubernetes.io/projected/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-kube-api-access-kpn47\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.841105 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.841063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.841105 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.841100 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.841267 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.841155 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.841453 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.841405 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.841592 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.841478 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19 15:34:39.841592 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.841570 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"
Apr 19
15:34:39.843810 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.843786 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p" Apr 19 15:34:39.844072 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.844050 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p" Apr 19 15:34:39.848023 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.848000 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpn47\" (UniqueName: \"kubernetes.io/projected/fff9a6a1-6553-41bf-8c3d-80c138fdee6e-kube-api-access-kpn47\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p\" (UID: \"fff9a6a1-6553-41bf-8c3d-80c138fdee6e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p" Apr 19 15:34:39.988685 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:39.988651 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p" Apr 19 15:34:40.132351 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:34:40.132315 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff9a6a1_6553_41bf_8c3d_80c138fdee6e.slice/crio-3e7072358b7e89f91381c7fd7ceb5633060e8600dc7d241b315c31f416648c54 WatchSource:0}: Error finding container 3e7072358b7e89f91381c7fd7ceb5633060e8600dc7d241b315c31f416648c54: Status 404 returned error can't find the container with id 3e7072358b7e89f91381c7fd7ceb5633060e8600dc7d241b315c31f416648c54 Apr 19 15:34:40.132909 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:40.132878 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p"] Apr 19 15:34:40.997024 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:40.996983 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p" event={"ID":"fff9a6a1-6553-41bf-8c3d-80c138fdee6e","Type":"ContainerStarted","Data":"40ec86b0e67c639584307c63e9a868fad3a1fac8d97037811ebce4048ea65fc4"} Apr 19 15:34:40.997438 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:40.997031 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p" event={"ID":"fff9a6a1-6553-41bf-8c3d-80c138fdee6e","Type":"ContainerStarted","Data":"3e7072358b7e89f91381c7fd7ceb5633060e8600dc7d241b315c31f416648c54"} Apr 19 15:34:43.007208 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:43.007163 2579 generic.go:358] "Generic (PLEG): container finished" podID="ac0fb6b7-ff74-4721-9cf5-f7df68c1c205" containerID="0ce6ae1cd7fe7a2445a149e0cdc1b33e2ea8f9c2eced0b64a4ec4b433d5ce9c8" exitCode=0 Apr 19 15:34:43.007806 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:43.007233 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk" 
event={"ID":"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205","Type":"ContainerDied","Data":"0ce6ae1cd7fe7a2445a149e0cdc1b33e2ea8f9c2eced0b64a4ec4b433d5ce9c8"} Apr 19 15:34:44.012994 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:44.012959 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk" event={"ID":"ac0fb6b7-ff74-4721-9cf5-f7df68c1c205","Type":"ContainerStarted","Data":"fbffd80c9a1fb81a7a5bbc56922c2660432af9b1647d3ed8dc4a7537616dbbf9"} Apr 19 15:34:44.013388 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:44.013184 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk" Apr 19 15:34:44.030907 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:44.030842 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk" podStartSLOduration=8.874220499 podStartE2EDuration="9.030822067s" podCreationTimestamp="2026-04-19 15:34:35 +0000 UTC" firstStartedPulling="2026-04-19 15:34:43.007971197 +0000 UTC m=+582.113886562" lastFinishedPulling="2026-04-19 15:34:43.164572751 +0000 UTC m=+582.270488130" observedRunningTime="2026-04-19 15:34:44.028781905 +0000 UTC m=+583.134697293" watchObservedRunningTime="2026-04-19 15:34:44.030822067 +0000 UTC m=+583.136737456" Apr 19 15:34:44.979657 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:44.979621 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-947g7" Apr 19 15:34:46.030275 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:46.030235 2579 generic.go:358] "Generic (PLEG): container finished" podID="fff9a6a1-6553-41bf-8c3d-80c138fdee6e" containerID="40ec86b0e67c639584307c63e9a868fad3a1fac8d97037811ebce4048ea65fc4" exitCode=0 Apr 19 15:34:46.030666 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:46.030317 2579 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p" event={"ID":"fff9a6a1-6553-41bf-8c3d-80c138fdee6e","Type":"ContainerDied","Data":"40ec86b0e67c639584307c63e9a868fad3a1fac8d97037811ebce4048ea65fc4"} Apr 19 15:34:47.036175 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:47.036141 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p" event={"ID":"fff9a6a1-6553-41bf-8c3d-80c138fdee6e","Type":"ContainerStarted","Data":"5607e693c7ab7be4f61090547cbec7feb7e6e16fa7266e9638bb426f2f27463d"} Apr 19 15:34:47.036578 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:47.036360 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p" Apr 19 15:34:47.056083 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:47.056029 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p" podStartSLOduration=7.88993946 podStartE2EDuration="8.056012286s" podCreationTimestamp="2026-04-19 15:34:39 +0000 UTC" firstStartedPulling="2026-04-19 15:34:46.03098197 +0000 UTC m=+585.136897336" lastFinishedPulling="2026-04-19 15:34:46.197054787 +0000 UTC m=+585.302970162" observedRunningTime="2026-04-19 15:34:47.052769006 +0000 UTC m=+586.158684395" watchObservedRunningTime="2026-04-19 15:34:47.056012286 +0000 UTC m=+586.161927675" Apr 19 15:34:55.031948 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:55.031913 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk" Apr 19 15:34:58.055023 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:34:58.054985 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p" Apr 19 15:35:01.456130 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:01.456102 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:35:01.459013 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:01.458991 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:35:49.802308 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:49.802274 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-c5b548-nxjvc"] Apr 19 15:35:49.804785 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:49.804765 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-c5b548-nxjvc" Apr 19 15:35:49.806948 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:49.806925 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 19 15:35:49.807064 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:49.806961 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-q77m2\"" Apr 19 15:35:49.811750 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:49.811703 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-c5b548-nxjvc"] Apr 19 15:35:49.818264 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:49.818231 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sk62\" (UniqueName: \"kubernetes.io/projected/170bddb3-b518-4c61-ba0d-b51af950506f-kube-api-access-8sk62\") pod \"authorino-c5b548-nxjvc\" (UID: \"170bddb3-b518-4c61-ba0d-b51af950506f\") " pod="kuadrant-system/authorino-c5b548-nxjvc" Apr 19 15:35:49.818708 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:49.818685 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" 
(UniqueName: \"kubernetes.io/secret/170bddb3-b518-4c61-ba0d-b51af950506f-tls-cert\") pod \"authorino-c5b548-nxjvc\" (UID: \"170bddb3-b518-4c61-ba0d-b51af950506f\") " pod="kuadrant-system/authorino-c5b548-nxjvc" Apr 19 15:35:49.919351 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:49.919308 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8sk62\" (UniqueName: \"kubernetes.io/projected/170bddb3-b518-4c61-ba0d-b51af950506f-kube-api-access-8sk62\") pod \"authorino-c5b548-nxjvc\" (UID: \"170bddb3-b518-4c61-ba0d-b51af950506f\") " pod="kuadrant-system/authorino-c5b548-nxjvc" Apr 19 15:35:49.919532 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:49.919391 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/170bddb3-b518-4c61-ba0d-b51af950506f-tls-cert\") pod \"authorino-c5b548-nxjvc\" (UID: \"170bddb3-b518-4c61-ba0d-b51af950506f\") " pod="kuadrant-system/authorino-c5b548-nxjvc" Apr 19 15:35:49.921939 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:49.921907 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/170bddb3-b518-4c61-ba0d-b51af950506f-tls-cert\") pod \"authorino-c5b548-nxjvc\" (UID: \"170bddb3-b518-4c61-ba0d-b51af950506f\") " pod="kuadrant-system/authorino-c5b548-nxjvc" Apr 19 15:35:49.927976 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:49.927951 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sk62\" (UniqueName: \"kubernetes.io/projected/170bddb3-b518-4c61-ba0d-b51af950506f-kube-api-access-8sk62\") pod \"authorino-c5b548-nxjvc\" (UID: \"170bddb3-b518-4c61-ba0d-b51af950506f\") " pod="kuadrant-system/authorino-c5b548-nxjvc" Apr 19 15:35:50.115296 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:50.115217 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-c5b548-nxjvc" Apr 19 15:35:50.249388 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:50.249359 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-c5b548-nxjvc"] Apr 19 15:35:50.251260 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:35:50.251235 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod170bddb3_b518_4c61_ba0d_b51af950506f.slice/crio-aab9d287fb949eaf6e4493782d82afe0ad29d23457f503ca1eaf944953051cc3 WatchSource:0}: Error finding container aab9d287fb949eaf6e4493782d82afe0ad29d23457f503ca1eaf944953051cc3: Status 404 returned error can't find the container with id aab9d287fb949eaf6e4493782d82afe0ad29d23457f503ca1eaf944953051cc3 Apr 19 15:35:50.252463 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:50.252443 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 15:35:50.299668 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:50.299630 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-c5b548-nxjvc" event={"ID":"170bddb3-b518-4c61-ba0d-b51af950506f","Type":"ContainerStarted","Data":"aab9d287fb949eaf6e4493782d82afe0ad29d23457f503ca1eaf944953051cc3"} Apr 19 15:35:51.305585 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:51.305539 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-c5b548-nxjvc" event={"ID":"170bddb3-b518-4c61-ba0d-b51af950506f","Type":"ContainerStarted","Data":"9792b18fc1607441dd157beea84f079fea6f6e23a4af5bd561a23945d7019b31"} Apr 19 15:35:51.322786 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:35:51.322693 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-c5b548-nxjvc" podStartSLOduration=1.9128740579999999 podStartE2EDuration="2.322674115s" podCreationTimestamp="2026-04-19 15:35:49 +0000 UTC" 
firstStartedPulling="2026-04-19 15:35:50.252565879 +0000 UTC m=+649.358481245" lastFinishedPulling="2026-04-19 15:35:50.662365936 +0000 UTC m=+649.768281302" observedRunningTime="2026-04-19 15:35:51.320682862 +0000 UTC m=+650.426598261" watchObservedRunningTime="2026-04-19 15:35:51.322674115 +0000 UTC m=+650.428589502" Apr 19 15:37:12.239802 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.239689 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-78fd9b446f-dxglg"] Apr 19 15:37:12.240299 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.240043 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-78fd9b446f-dxglg" podUID="30e3f4ed-12b9-435e-85e2-603bfcedc0d5" containerName="manager" containerID="cri-o://ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e" gracePeriod=10 Apr 19 15:37:12.489079 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.489050 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-78fd9b446f-dxglg" Apr 19 15:37:12.514122 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.514037 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4797p\" (UniqueName: \"kubernetes.io/projected/30e3f4ed-12b9-435e-85e2-603bfcedc0d5-kube-api-access-4797p\") pod \"30e3f4ed-12b9-435e-85e2-603bfcedc0d5\" (UID: \"30e3f4ed-12b9-435e-85e2-603bfcedc0d5\") " Apr 19 15:37:12.516315 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.516285 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e3f4ed-12b9-435e-85e2-603bfcedc0d5-kube-api-access-4797p" (OuterVolumeSpecName: "kube-api-access-4797p") pod "30e3f4ed-12b9-435e-85e2-603bfcedc0d5" (UID: "30e3f4ed-12b9-435e-85e2-603bfcedc0d5"). InnerVolumeSpecName "kube-api-access-4797p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:37:12.614907 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.614864 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4797p\" (UniqueName: \"kubernetes.io/projected/30e3f4ed-12b9-435e-85e2-603bfcedc0d5-kube-api-access-4797p\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:37:12.647394 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.647361 2579 generic.go:358] "Generic (PLEG): container finished" podID="30e3f4ed-12b9-435e-85e2-603bfcedc0d5" containerID="ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e" exitCode=0 Apr 19 15:37:12.647581 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.647424 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-78fd9b446f-dxglg" Apr 19 15:37:12.647581 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.647450 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-78fd9b446f-dxglg" event={"ID":"30e3f4ed-12b9-435e-85e2-603bfcedc0d5","Type":"ContainerDied","Data":"ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e"} Apr 19 15:37:12.647581 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.647490 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-78fd9b446f-dxglg" event={"ID":"30e3f4ed-12b9-435e-85e2-603bfcedc0d5","Type":"ContainerDied","Data":"12048d8eb9d30131f525105d754fea35d79f4e3451594d5237526ea525c0c0a1"} Apr 19 15:37:12.647581 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.647506 2579 scope.go:117] "RemoveContainer" containerID="ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e" Apr 19 15:37:12.657692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.657670 2579 scope.go:117] "RemoveContainer" containerID="ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e" Apr 19 15:37:12.658021 ip-10-0-133-218 
kubenswrapper[2579]: E0419 15:37:12.657996 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e\": container with ID starting with ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e not found: ID does not exist" containerID="ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e" Apr 19 15:37:12.658084 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.658035 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e"} err="failed to get container status \"ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e\": rpc error: code = NotFound desc = could not find container \"ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e\": container with ID starting with ab2f50d3af9d0a6fa34e1a2857c873605e28a4a4a06ed1971e70a33ae315477e not found: ID does not exist" Apr 19 15:37:12.674708 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.674640 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-78fd9b446f-dxglg"] Apr 19 15:37:12.677583 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:12.677561 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-78fd9b446f-dxglg"] Apr 19 15:37:13.353289 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.353243 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-78fd9b446f-sz2gj"] Apr 19 15:37:13.353692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.353668 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30e3f4ed-12b9-435e-85e2-603bfcedc0d5" containerName="manager" Apr 19 15:37:13.353692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.353679 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="30e3f4ed-12b9-435e-85e2-603bfcedc0d5" containerName="manager" Apr 19 15:37:13.353792 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.353775 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="30e3f4ed-12b9-435e-85e2-603bfcedc0d5" containerName="manager" Apr 19 15:37:13.357707 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.357687 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-78fd9b446f-sz2gj" Apr 19 15:37:13.359962 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.359936 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-ncp5d\"" Apr 19 15:37:13.362656 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.362630 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-78fd9b446f-sz2gj"] Apr 19 15:37:13.421451 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.421420 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6hlj\" (UniqueName: \"kubernetes.io/projected/cbeb5ce2-5e31-46a3-822d-5ca343065cf0-kube-api-access-m6hlj\") pod \"maas-controller-78fd9b446f-sz2gj\" (UID: \"cbeb5ce2-5e31-46a3-822d-5ca343065cf0\") " pod="opendatahub/maas-controller-78fd9b446f-sz2gj" Apr 19 15:37:13.522916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.522870 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6hlj\" (UniqueName: \"kubernetes.io/projected/cbeb5ce2-5e31-46a3-822d-5ca343065cf0-kube-api-access-m6hlj\") pod \"maas-controller-78fd9b446f-sz2gj\" (UID: \"cbeb5ce2-5e31-46a3-822d-5ca343065cf0\") " pod="opendatahub/maas-controller-78fd9b446f-sz2gj" Apr 19 15:37:13.530571 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.530543 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6hlj\" (UniqueName: 
\"kubernetes.io/projected/cbeb5ce2-5e31-46a3-822d-5ca343065cf0-kube-api-access-m6hlj\") pod \"maas-controller-78fd9b446f-sz2gj\" (UID: \"cbeb5ce2-5e31-46a3-822d-5ca343065cf0\") " pod="opendatahub/maas-controller-78fd9b446f-sz2gj" Apr 19 15:37:13.546628 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.546596 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e3f4ed-12b9-435e-85e2-603bfcedc0d5" path="/var/lib/kubelet/pods/30e3f4ed-12b9-435e-85e2-603bfcedc0d5/volumes" Apr 19 15:37:13.669430 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.669337 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-78fd9b446f-sz2gj" Apr 19 15:37:13.803053 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:13.803029 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-78fd9b446f-sz2gj"] Apr 19 15:37:13.805119 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:37:13.805078 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbeb5ce2_5e31_46a3_822d_5ca343065cf0.slice/crio-02118fa5e5db8bd28d387baf133aa2e92461e73dd848d4a7e9cac92af158685b WatchSource:0}: Error finding container 02118fa5e5db8bd28d387baf133aa2e92461e73dd848d4a7e9cac92af158685b: Status 404 returned error can't find the container with id 02118fa5e5db8bd28d387baf133aa2e92461e73dd848d4a7e9cac92af158685b Apr 19 15:37:14.658245 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:14.658152 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-78fd9b446f-sz2gj" event={"ID":"cbeb5ce2-5e31-46a3-822d-5ca343065cf0","Type":"ContainerStarted","Data":"2e485ac43d875b68aa16ce6d3c6369bd6e7a9d725824cc7786bc08dde08af258"} Apr 19 15:37:14.658245 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:14.658189 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-78fd9b446f-sz2gj" 
event={"ID":"cbeb5ce2-5e31-46a3-822d-5ca343065cf0","Type":"ContainerStarted","Data":"02118fa5e5db8bd28d387baf133aa2e92461e73dd848d4a7e9cac92af158685b"} Apr 19 15:37:14.658245 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:14.658219 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-78fd9b446f-sz2gj" Apr 19 15:37:14.676337 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:14.676277 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-78fd9b446f-sz2gj" podStartSLOduration=1.296366861 podStartE2EDuration="1.676257589s" podCreationTimestamp="2026-04-19 15:37:13 +0000 UTC" firstStartedPulling="2026-04-19 15:37:13.806470269 +0000 UTC m=+732.912385634" lastFinishedPulling="2026-04-19 15:37:14.186360992 +0000 UTC m=+733.292276362" observedRunningTime="2026-04-19 15:37:14.675700436 +0000 UTC m=+733.781615824" watchObservedRunningTime="2026-04-19 15:37:14.676257589 +0000 UTC m=+733.782172977" Apr 19 15:37:25.668055 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:37:25.668017 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-78fd9b446f-sz2gj" Apr 19 15:40:01.496386 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:40:01.496295 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:40:01.504747 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:40:01.504687 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:45:00.136585 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:00.136549 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29610225-pnggm"] Apr 19 15:45:00.140186 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:00.140165 2579 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" Apr 19 15:45:00.142246 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:00.142230 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-rcvbl\"" Apr 19 15:45:00.152141 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:00.152118 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610225-pnggm"] Apr 19 15:45:00.246059 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:00.246019 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxj94\" (UniqueName: \"kubernetes.io/projected/5884a3d5-b5ef-4566-9750-d4c195488baa-kube-api-access-mxj94\") pod \"maas-api-key-cleanup-29610225-pnggm\" (UID: \"5884a3d5-b5ef-4566-9750-d4c195488baa\") " pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" Apr 19 15:45:00.347477 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:00.347431 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxj94\" (UniqueName: \"kubernetes.io/projected/5884a3d5-b5ef-4566-9750-d4c195488baa-kube-api-access-mxj94\") pod \"maas-api-key-cleanup-29610225-pnggm\" (UID: \"5884a3d5-b5ef-4566-9750-d4c195488baa\") " pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" Apr 19 15:45:00.355320 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:00.355286 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxj94\" (UniqueName: \"kubernetes.io/projected/5884a3d5-b5ef-4566-9750-d4c195488baa-kube-api-access-mxj94\") pod \"maas-api-key-cleanup-29610225-pnggm\" (UID: \"5884a3d5-b5ef-4566-9750-d4c195488baa\") " pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" Apr 19 15:45:00.451319 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:00.451284 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" Apr 19 15:45:00.580404 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:00.580379 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610225-pnggm"] Apr 19 15:45:00.582949 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:45:00.582921 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5884a3d5_b5ef_4566_9750_d4c195488baa.slice/crio-93bfd6856847c789d5a4eca023ca81e40ad5e64646fc6f29173114d979d06e70 WatchSource:0}: Error finding container 93bfd6856847c789d5a4eca023ca81e40ad5e64646fc6f29173114d979d06e70: Status 404 returned error can't find the container with id 93bfd6856847c789d5a4eca023ca81e40ad5e64646fc6f29173114d979d06e70 Apr 19 15:45:00.584810 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:00.584791 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 15:45:00.638134 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:00.638091 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" event={"ID":"5884a3d5-b5ef-4566-9750-d4c195488baa","Type":"ContainerStarted","Data":"93bfd6856847c789d5a4eca023ca81e40ad5e64646fc6f29173114d979d06e70"} Apr 19 15:45:01.532697 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:01.532668 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:45:01.541386 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:01.541363 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:45:01.644502 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:01.644468 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" event={"ID":"5884a3d5-b5ef-4566-9750-d4c195488baa","Type":"ContainerStarted","Data":"3f7701791056a27ef8ddbe113d8f279be3abe3c16defe21125679d12d4932517"} Apr 19 15:45:01.659513 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:01.659467 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" podStartSLOduration=1.054297873 podStartE2EDuration="1.659448527s" podCreationTimestamp="2026-04-19 15:45:00 +0000 UTC" firstStartedPulling="2026-04-19 15:45:00.584954528 +0000 UTC m=+1199.690869894" lastFinishedPulling="2026-04-19 15:45:01.190105182 +0000 UTC m=+1200.296020548" observedRunningTime="2026-04-19 15:45:01.657042403 +0000 UTC m=+1200.762957832" watchObservedRunningTime="2026-04-19 15:45:01.659448527 +0000 UTC m=+1200.765363917" Apr 19 15:45:22.736793 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:22.736745 2579 generic.go:358] "Generic (PLEG): container finished" podID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerID="3f7701791056a27ef8ddbe113d8f279be3abe3c16defe21125679d12d4932517" exitCode=6 Apr 19 15:45:22.737279 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:22.737011 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" event={"ID":"5884a3d5-b5ef-4566-9750-d4c195488baa","Type":"ContainerDied","Data":"3f7701791056a27ef8ddbe113d8f279be3abe3c16defe21125679d12d4932517"} Apr 19 15:45:22.737623 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:22.737583 2579 scope.go:117] "RemoveContainer" containerID="3f7701791056a27ef8ddbe113d8f279be3abe3c16defe21125679d12d4932517" Apr 19 15:45:23.742654 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:23.742610 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" 
event={"ID":"5884a3d5-b5ef-4566-9750-d4c195488baa","Type":"ContainerStarted","Data":"feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b"} Apr 19 15:45:43.824128 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:43.824038 2579 generic.go:358] "Generic (PLEG): container finished" podID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerID="feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b" exitCode=6 Apr 19 15:45:43.824128 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:43.824100 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" event={"ID":"5884a3d5-b5ef-4566-9750-d4c195488baa","Type":"ContainerDied","Data":"feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b"} Apr 19 15:45:43.824623 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:43.824132 2579 scope.go:117] "RemoveContainer" containerID="3f7701791056a27ef8ddbe113d8f279be3abe3c16defe21125679d12d4932517" Apr 19 15:45:43.824623 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:43.824508 2579 scope.go:117] "RemoveContainer" containerID="feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b" Apr 19 15:45:43.824854 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:45:43.824827 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29610225-pnggm_opendatahub(5884a3d5-b5ef-4566-9750-d4c195488baa)\"" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" Apr 19 15:45:57.541682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:57.541599 2579 scope.go:117] "RemoveContainer" containerID="feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b" Apr 19 15:45:58.887763 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:58.887699 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" event={"ID":"5884a3d5-b5ef-4566-9750-d4c195488baa","Type":"ContainerStarted","Data":"dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b"} Apr 19 15:45:59.912313 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:59.912276 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610225-pnggm"] Apr 19 15:45:59.912801 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:45:59.912488 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerName="cleanup" containerID="cri-o://dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b" gracePeriod=30 Apr 19 15:46:18.661798 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.661772 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" Apr 19 15:46:18.672017 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.671990 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxj94\" (UniqueName: \"kubernetes.io/projected/5884a3d5-b5ef-4566-9750-d4c195488baa-kube-api-access-mxj94\") pod \"5884a3d5-b5ef-4566-9750-d4c195488baa\" (UID: \"5884a3d5-b5ef-4566-9750-d4c195488baa\") " Apr 19 15:46:18.674141 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.674115 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5884a3d5-b5ef-4566-9750-d4c195488baa-kube-api-access-mxj94" (OuterVolumeSpecName: "kube-api-access-mxj94") pod "5884a3d5-b5ef-4566-9750-d4c195488baa" (UID: "5884a3d5-b5ef-4566-9750-d4c195488baa"). InnerVolumeSpecName "kube-api-access-mxj94". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:46:18.772670 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.772581 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mxj94\" (UniqueName: \"kubernetes.io/projected/5884a3d5-b5ef-4566-9750-d4c195488baa-kube-api-access-mxj94\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:46:18.971906 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.971869 2579 generic.go:358] "Generic (PLEG): container finished" podID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerID="dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b" exitCode=6 Apr 19 15:46:18.972099 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.971937 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" Apr 19 15:46:18.972099 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.971955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" event={"ID":"5884a3d5-b5ef-4566-9750-d4c195488baa","Type":"ContainerDied","Data":"dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b"} Apr 19 15:46:18.972099 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.971988 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610225-pnggm" event={"ID":"5884a3d5-b5ef-4566-9750-d4c195488baa","Type":"ContainerDied","Data":"93bfd6856847c789d5a4eca023ca81e40ad5e64646fc6f29173114d979d06e70"} Apr 19 15:46:18.972099 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.972003 2579 scope.go:117] "RemoveContainer" containerID="dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b" Apr 19 15:46:18.981695 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.981676 2579 scope.go:117] "RemoveContainer" containerID="feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b" Apr 19 15:46:18.989765 ip-10-0-133-218 
kubenswrapper[2579]: I0419 15:46:18.989747 2579 scope.go:117] "RemoveContainer" containerID="dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b" Apr 19 15:46:18.990075 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:46:18.990050 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b\": container with ID starting with dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b not found: ID does not exist" containerID="dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b" Apr 19 15:46:18.990258 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.990082 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b"} err="failed to get container status \"dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b\": rpc error: code = NotFound desc = could not find container \"dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b\": container with ID starting with dda3cc8a20d29d90fdd92ce79904f2adbe312ec77c8d97949c668c5af1123a3b not found: ID does not exist" Apr 19 15:46:18.990258 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.990100 2579 scope.go:117] "RemoveContainer" containerID="feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b" Apr 19 15:46:18.990471 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:46:18.990396 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b\": container with ID starting with feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b not found: ID does not exist" containerID="feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b" Apr 19 15:46:18.990471 ip-10-0-133-218 kubenswrapper[2579]: 
I0419 15:46:18.990438 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b"} err="failed to get container status \"feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b\": rpc error: code = NotFound desc = could not find container \"feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b\": container with ID starting with feb06e8bbce428f76a288360efb7b1ae745a6cc00f313d9335e27f849a442b4b not found: ID does not exist" Apr 19 15:46:18.994190 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.994169 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610225-pnggm"] Apr 19 15:46:18.996615 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:18.996594 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610225-pnggm"] Apr 19 15:46:19.547147 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:46:19.547116 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" path="/var/lib/kubelet/pods/5884a3d5-b5ef-4566-9750-d4c195488baa/volumes" Apr 19 15:47:22.797517 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:22.797443 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n"] Apr 19 15:47:22.798000 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:22.797682 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" podUID="1af66041-ebb4-40f6-a6a9-f491ef0efd14" containerName="manager" containerID="cri-o://010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f" gracePeriod=10 Apr 19 15:47:23.672191 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:23.672161 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:47:23.797259 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:23.797167 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5h7b\" (UniqueName: \"kubernetes.io/projected/1af66041-ebb4-40f6-a6a9-f491ef0efd14-kube-api-access-d5h7b\") pod \"1af66041-ebb4-40f6-a6a9-f491ef0efd14\" (UID: \"1af66041-ebb4-40f6-a6a9-f491ef0efd14\") " Apr 19 15:47:23.797418 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:23.797301 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1af66041-ebb4-40f6-a6a9-f491ef0efd14-extensions-socket-volume\") pod \"1af66041-ebb4-40f6-a6a9-f491ef0efd14\" (UID: \"1af66041-ebb4-40f6-a6a9-f491ef0efd14\") " Apr 19 15:47:23.797694 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:23.797665 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1af66041-ebb4-40f6-a6a9-f491ef0efd14-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "1af66041-ebb4-40f6-a6a9-f491ef0efd14" (UID: "1af66041-ebb4-40f6-a6a9-f491ef0efd14"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 15:47:23.799501 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:23.799473 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af66041-ebb4-40f6-a6a9-f491ef0efd14-kube-api-access-d5h7b" (OuterVolumeSpecName: "kube-api-access-d5h7b") pod "1af66041-ebb4-40f6-a6a9-f491ef0efd14" (UID: "1af66041-ebb4-40f6-a6a9-f491ef0efd14"). InnerVolumeSpecName "kube-api-access-d5h7b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 15:47:23.899079 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:23.899043 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d5h7b\" (UniqueName: \"kubernetes.io/projected/1af66041-ebb4-40f6-a6a9-f491ef0efd14-kube-api-access-d5h7b\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:47:23.899079 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:23.899072 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1af66041-ebb4-40f6-a6a9-f491ef0efd14-extensions-socket-volume\") on node \"ip-10-0-133-218.ec2.internal\" DevicePath \"\"" Apr 19 15:47:24.241903 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:24.241861 2579 generic.go:358] "Generic (PLEG): container finished" podID="1af66041-ebb4-40f6-a6a9-f491ef0efd14" containerID="010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f" exitCode=0 Apr 19 15:47:24.242083 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:24.241939 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" Apr 19 15:47:24.242083 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:24.241946 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" event={"ID":"1af66041-ebb4-40f6-a6a9-f491ef0efd14","Type":"ContainerDied","Data":"010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f"} Apr 19 15:47:24.242083 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:24.241989 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n" event={"ID":"1af66041-ebb4-40f6-a6a9-f491ef0efd14","Type":"ContainerDied","Data":"2a2952d6bc156c91bffa275486988b1db91c5806063dfb4fac0eddb6b7c4c657"} Apr 19 15:47:24.242083 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:24.242010 2579 scope.go:117] "RemoveContainer" containerID="010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f" Apr 19 15:47:24.254904 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:24.254860 2579 scope.go:117] "RemoveContainer" containerID="010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f" Apr 19 15:47:24.255176 ip-10-0-133-218 kubenswrapper[2579]: E0419 15:47:24.255156 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f\": container with ID starting with 010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f not found: ID does not exist" containerID="010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f" Apr 19 15:47:24.255230 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:24.255185 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f"} err="failed to get container status 
\"010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f\": rpc error: code = NotFound desc = could not find container \"010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f\": container with ID starting with 010b9a17f02af55e7d40b58a5c87073d43505efb755714ce991f93ac6630f87f not found: ID does not exist" Apr 19 15:47:24.263505 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:24.263482 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n"] Apr 19 15:47:24.266940 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:24.266915 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgp2n"] Apr 19 15:47:25.552501 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:47:25.552464 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1af66041-ebb4-40f6-a6a9-f491ef0efd14" path="/var/lib/kubelet/pods/1af66041-ebb4-40f6-a6a9-f491ef0efd14/volumes" Apr 19 15:48:28.882235 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.882202 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn"] Apr 19 15:48:28.884629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.882786 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerName="cleanup" Apr 19 15:48:28.884629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.882805 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerName="cleanup" Apr 19 15:48:28.884629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.882820 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerName="cleanup" Apr 19 15:48:28.884629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.882828 2579 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerName="cleanup" Apr 19 15:48:28.884629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.882851 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1af66041-ebb4-40f6-a6a9-f491ef0efd14" containerName="manager" Apr 19 15:48:28.884629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.882860 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af66041-ebb4-40f6-a6a9-f491ef0efd14" containerName="manager" Apr 19 15:48:28.884629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.882953 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1af66041-ebb4-40f6-a6a9-f491ef0efd14" containerName="manager" Apr 19 15:48:28.884629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.882971 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerName="cleanup" Apr 19 15:48:28.884629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.882978 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerName="cleanup" Apr 19 15:48:28.884629 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.882985 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerName="cleanup" Apr 19 15:48:28.885964 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.885945 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" Apr 19 15:48:28.888982 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.888960 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ltvvm\"" Apr 19 15:48:28.901231 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:28.901207 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn"] Apr 19 15:48:29.005927 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.005886 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thbv2\" (UniqueName: \"kubernetes.io/projected/0c0c92fd-79cd-4ee8-bffd-3d085e8f2698-kube-api-access-thbv2\") pod \"kuadrant-operator-controller-manager-55c7f4c975-bdtkn\" (UID: \"0c0c92fd-79cd-4ee8-bffd-3d085e8f2698\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" Apr 19 15:48:29.006099 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.005973 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c0c92fd-79cd-4ee8-bffd-3d085e8f2698-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-bdtkn\" (UID: \"0c0c92fd-79cd-4ee8-bffd-3d085e8f2698\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" Apr 19 15:48:29.107020 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.106981 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thbv2\" (UniqueName: \"kubernetes.io/projected/0c0c92fd-79cd-4ee8-bffd-3d085e8f2698-kube-api-access-thbv2\") pod \"kuadrant-operator-controller-manager-55c7f4c975-bdtkn\" (UID: \"0c0c92fd-79cd-4ee8-bffd-3d085e8f2698\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" Apr 19 15:48:29.107193 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.107042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c0c92fd-79cd-4ee8-bffd-3d085e8f2698-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-bdtkn\" (UID: \"0c0c92fd-79cd-4ee8-bffd-3d085e8f2698\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" Apr 19 15:48:29.107408 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.107391 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c0c92fd-79cd-4ee8-bffd-3d085e8f2698-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-bdtkn\" (UID: \"0c0c92fd-79cd-4ee8-bffd-3d085e8f2698\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" Apr 19 15:48:29.115822 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.115797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thbv2\" (UniqueName: \"kubernetes.io/projected/0c0c92fd-79cd-4ee8-bffd-3d085e8f2698-kube-api-access-thbv2\") pod \"kuadrant-operator-controller-manager-55c7f4c975-bdtkn\" (UID: \"0c0c92fd-79cd-4ee8-bffd-3d085e8f2698\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" Apr 19 15:48:29.198016 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.197980 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" Apr 19 15:48:29.332952 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.332923 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn"] Apr 19 15:48:29.335028 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:48:29.334996 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c0c92fd_79cd_4ee8_bffd_3d085e8f2698.slice/crio-167ec07ae35c7c9a6d794c44e4457aae4f3a89b901ad675a375ea987cc89d5a6 WatchSource:0}: Error finding container 167ec07ae35c7c9a6d794c44e4457aae4f3a89b901ad675a375ea987cc89d5a6: Status 404 returned error can't find the container with id 167ec07ae35c7c9a6d794c44e4457aae4f3a89b901ad675a375ea987cc89d5a6 Apr 19 15:48:29.519665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.519575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" event={"ID":"0c0c92fd-79cd-4ee8-bffd-3d085e8f2698","Type":"ContainerStarted","Data":"652cf8c00ad9f5f8667385cc7e67e81b752a18063ac1001847e8157d5c9d7da4"} Apr 19 15:48:29.519665 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.519616 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" event={"ID":"0c0c92fd-79cd-4ee8-bffd-3d085e8f2698","Type":"ContainerStarted","Data":"167ec07ae35c7c9a6d794c44e4457aae4f3a89b901ad675a375ea987cc89d5a6"} Apr 19 15:48:29.519964 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.519677 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" Apr 19 15:48:29.538448 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:29.538399 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" podStartSLOduration=1.538384056 podStartE2EDuration="1.538384056s" podCreationTimestamp="2026-04-19 15:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:48:29.535181576 +0000 UTC m=+1408.641096956" watchObservedRunningTime="2026-04-19 15:48:29.538384056 +0000 UTC m=+1408.644299443" Apr 19 15:48:40.525221 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:48:40.525187 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-bdtkn" Apr 19 15:50:01.580829 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:50:01.580799 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:50:01.586512 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:50:01.586489 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:55:01.615239 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:55:01.615208 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:55:01.625756 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:55:01.625708 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:58:10.049021 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:10.048942 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-c5b548-nxjvc_170bddb3-b518-4c61-ba0d-b51af950506f/authorino/0.log" Apr 19 15:58:13.818190 ip-10-0-133-218 kubenswrapper[2579]: 
I0419 15:58:13.818159 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-78fd9b446f-sz2gj_cbeb5ce2-5e31-46a3-822d-5ca343065cf0/manager/0.log" Apr 19 15:58:14.048239 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:14.048209 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-67944f454b-2kv92_9ff7b2f4-5dfd-491b-ab80-88ce950644c4/manager/0.log" Apr 19 15:58:15.102006 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.101977 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz_6a58af15-197e-4703-9c60-0b75920abcf7/pull/0.log" Apr 19 15:58:15.107138 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.107117 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz_6a58af15-197e-4703-9c60-0b75920abcf7/extract/0.log" Apr 19 15:58:15.112284 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.112265 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz_6a58af15-197e-4703-9c60-0b75920abcf7/util/0.log" Apr 19 15:58:15.218483 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.218451 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs_8b042e94-664b-4e73-9ca2-3a8f1330cacd/extract/0.log" Apr 19 15:58:15.223686 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.223665 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs_8b042e94-664b-4e73-9ca2-3a8f1330cacd/util/0.log" Apr 19 15:58:15.229216 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.229185 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs_8b042e94-664b-4e73-9ca2-3a8f1330cacd/pull/0.log" Apr 19 15:58:15.329672 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.329642 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw_6fa5de8e-282f-472b-bc6d-b56bf8c2fc82/util/0.log" Apr 19 15:58:15.334861 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.334841 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw_6fa5de8e-282f-472b-bc6d-b56bf8c2fc82/pull/0.log" Apr 19 15:58:15.339755 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.339736 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw_6fa5de8e-282f-472b-bc6d-b56bf8c2fc82/extract/0.log" Apr 19 15:58:15.438292 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.438264 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws_29a72e0e-4e70-414b-95a6-6ab81cf351b6/util/0.log" Apr 19 15:58:15.443682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.443659 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws_29a72e0e-4e70-414b-95a6-6ab81cf351b6/pull/0.log" Apr 19 15:58:15.448690 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.448670 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws_29a72e0e-4e70-414b-95a6-6ab81cf351b6/extract/0.log" Apr 19 15:58:15.558171 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.558143 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-c5b548-nxjvc_170bddb3-b518-4c61-ba0d-b51af950506f/authorino/0.log" Apr 19 15:58:15.767324 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.767249 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-shj9m_63ad4b6f-3b46-4d70-abc4-8ecbc1a30d94/manager/0.log" Apr 19 15:58:15.863694 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.863666 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-6xkts_486cbe01-c696-4420-b8d2-7a243e85b26d/kuadrant-console-plugin/0.log" Apr 19 15:58:15.981362 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:15.981329 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-2r9zr_4b6aecfb-f463-4b12-8f75-9252f1a3d5ee/registry-server/0.log" Apr 19 15:58:16.092433 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:16.092354 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-bdtkn_0c0c92fd-79cd-4ee8-bffd-3d085e8f2698/manager/0.log" Apr 19 15:58:16.629131 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:16.629105 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc_3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb/istio-proxy/0.log" Apr 19 15:58:17.044896 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:17.044853 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-k8m5w_bc0715c9-b497-4e43-9ffb-a87664024408/istio-proxy/0.log" Apr 19 15:58:17.577375 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:17.577343 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p_fff9a6a1-6553-41bf-8c3d-80c138fdee6e/storage-initializer/0.log" Apr 19 
15:58:17.583244 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:17.583222 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-z4x8p_fff9a6a1-6553-41bf-8c3d-80c138fdee6e/main/0.log" Apr 19 15:58:17.689389 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:17.689361 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-947g7_cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8/storage-initializer/0.log" Apr 19 15:58:17.695775 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:17.695753 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-947g7_cc45878c-6dbb-47e9-9a41-a1bfd9c6edf8/main/0.log" Apr 19 15:58:17.918658 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:17.918579 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk_ac0fb6b7-ff74-4721-9cf5-f7df68c1c205/storage-initializer/0.log" Apr 19 15:58:17.924642 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:17.924619 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-9lcrk_ac0fb6b7-ff74-4721-9cf5-f7df68c1c205/main/0.log" Apr 19 15:58:24.467001 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:24.466974 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-spsbv_55899c70-f0e0-481b-94b0-9aad2305242f/global-pull-secret-syncer/0.log" Apr 19 15:58:24.512179 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:24.512148 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7r729_3f939e64-6dbd-4802-9d83-7251b53cdcb5/konnectivity-agent/0.log" Apr 19 15:58:24.635661 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:24.635631 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-218.ec2.internal_1ddb39581fd73259e0c27abfdf033d32/haproxy/0.log" Apr 19 15:58:28.023683 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.023650 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz_6a58af15-197e-4703-9c60-0b75920abcf7/extract/0.log" Apr 19 15:58:28.050196 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.050166 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz_6a58af15-197e-4703-9c60-0b75920abcf7/util/0.log" Apr 19 15:58:28.074568 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.074534 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759fjgzz_6a58af15-197e-4703-9c60-0b75920abcf7/pull/0.log" Apr 19 15:58:28.103132 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.103106 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs_8b042e94-664b-4e73-9ca2-3a8f1330cacd/extract/0.log" Apr 19 15:58:28.123088 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.123062 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs_8b042e94-664b-4e73-9ca2-3a8f1330cacd/util/0.log" Apr 19 15:58:28.143367 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.143340 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05wccs_8b042e94-664b-4e73-9ca2-3a8f1330cacd/pull/0.log" Apr 19 15:58:28.186342 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.186306 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw_6fa5de8e-282f-472b-bc6d-b56bf8c2fc82/extract/0.log" Apr 19 15:58:28.203979 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.203956 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw_6fa5de8e-282f-472b-bc6d-b56bf8c2fc82/util/0.log" Apr 19 15:58:28.228545 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.228524 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gx7bw_6fa5de8e-282f-472b-bc6d-b56bf8c2fc82/pull/0.log" Apr 19 15:58:28.257897 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.257872 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws_29a72e0e-4e70-414b-95a6-6ab81cf351b6/extract/0.log" Apr 19 15:58:28.278130 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.278071 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws_29a72e0e-4e70-414b-95a6-6ab81cf351b6/util/0.log" Apr 19 15:58:28.301104 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.301083 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1wpbws_29a72e0e-4e70-414b-95a6-6ab81cf351b6/pull/0.log" Apr 19 15:58:28.513074 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.513040 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-c5b548-nxjvc_170bddb3-b518-4c61-ba0d-b51af950506f/authorino/0.log" Apr 19 15:58:28.563251 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.563167 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-shj9m_63ad4b6f-3b46-4d70-abc4-8ecbc1a30d94/manager/0.log" Apr 19 15:58:28.585986 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.585955 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-6xkts_486cbe01-c696-4420-b8d2-7a243e85b26d/kuadrant-console-plugin/0.log" Apr 19 15:58:28.616656 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.616627 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-2r9zr_4b6aecfb-f463-4b12-8f75-9252f1a3d5ee/registry-server/0.log" Apr 19 15:58:28.671219 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:28.671186 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-bdtkn_0c0c92fd-79cd-4ee8-bffd-3d085e8f2698/manager/0.log" Apr 19 15:58:30.352261 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.352236 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-lgqmg_6185d8e4-ff61-4ce3-9885-8aaeca0c15ca/cluster-monitoring-operator/0.log" Apr 19 15:58:30.375062 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.375036 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-94dch_036941a2-bcef-4eb6-8c54-93c25b36bbb2/kube-state-metrics/0.log" Apr 19 15:58:30.397365 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.397290 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-94dch_036941a2-bcef-4eb6-8c54-93c25b36bbb2/kube-rbac-proxy-main/0.log" Apr 19 15:58:30.418582 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.418556 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-94dch_036941a2-bcef-4eb6-8c54-93c25b36bbb2/kube-rbac-proxy-self/0.log" Apr 19 15:58:30.443569 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.443539 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6bbcbb9c4d-x7js9_5317d4de-3d06-404c-a2c2-c50612bb163d/metrics-server/0.log" Apr 19 15:58:30.466845 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.466817 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-jd7xh_87f936be-8278-46c5-8f16-d81b0eb95086/monitoring-plugin/0.log" Apr 19 15:58:30.493115 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.493085 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4jl6h_8e99a9b5-7492-4de3-860f-2ea9f58eaf8a/node-exporter/0.log" Apr 19 15:58:30.511129 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.511103 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4jl6h_8e99a9b5-7492-4de3-860f-2ea9f58eaf8a/kube-rbac-proxy/0.log" Apr 19 15:58:30.529993 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.529966 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4jl6h_8e99a9b5-7492-4de3-860f-2ea9f58eaf8a/init-textfile/0.log" Apr 19 15:58:30.696678 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.696649 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-x6zt4_497c2a9d-d724-4361-a355-cc733a36f75f/kube-rbac-proxy-main/0.log" Apr 19 15:58:30.717362 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.717331 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-x6zt4_497c2a9d-d724-4361-a355-cc733a36f75f/kube-rbac-proxy-self/0.log" Apr 19 15:58:30.736381 ip-10-0-133-218 
kubenswrapper[2579]: I0419 15:58:30.736352 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-x6zt4_497c2a9d-d724-4361-a355-cc733a36f75f/openshift-state-metrics/0.log" Apr 19 15:58:30.999381 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:30.999302 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-d8927_20ed457f-c873-4a9b-ae0c-36282b3723a2/prometheus-operator-admission-webhook/0.log" Apr 19 15:58:32.386525 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:32.386492 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-vlqlr_edae232e-010b-4d24-a5b7-d4e138925f66/networking-console-plugin/0.log" Apr 19 15:58:32.982117 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:32.982083 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r"] Apr 19 15:58:32.982530 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:32.982517 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerName="cleanup" Apr 19 15:58:32.982575 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:32.982532 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5884a3d5-b5ef-4566-9750-d4c195488baa" containerName="cleanup" Apr 19 15:58:32.985730 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:32.985701 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:32.988103 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:32.988076 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w8x7s\"/\"openshift-service-ca.crt\"" Apr 19 15:58:32.988103 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:32.988098 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-w8x7s\"/\"default-dockercfg-775qz\"" Apr 19 15:58:32.988607 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:32.988594 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w8x7s\"/\"kube-root-ca.crt\"" Apr 19 15:58:32.994029 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:32.994007 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r"] Apr 19 15:58:33.138378 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.138342 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-sys\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.138560 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.138393 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv8f6\" (UniqueName: \"kubernetes.io/projected/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-kube-api-access-nv8f6\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.138560 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.138486 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-lib-modules\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.138669 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.138592 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-proc\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.138669 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.138648 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-podres\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.239454 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.239354 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-sys\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.239454 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.239418 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv8f6\" (UniqueName: \"kubernetes.io/projected/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-kube-api-access-nv8f6\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " 
pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.239692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.239468 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-lib-modules\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.239692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.239488 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-sys\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.239692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.239549 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-proc\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.239692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.239576 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-podres\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.239692 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.239632 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-lib-modules\") pod 
\"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.239923 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.239695 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-podres\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.239923 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.239703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-proc\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.246566 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.246545 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv8f6\" (UniqueName: \"kubernetes.io/projected/68e9b2d5-d48c-4ad7-9698-ba61bfbdb233-kube-api-access-nv8f6\") pod \"perf-node-gather-daemonset-kfr5r\" (UID: \"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233\") " pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.297682 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.297635 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:33.425901 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.425870 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r"] Apr 19 15:58:33.427651 ip-10-0-133-218 kubenswrapper[2579]: W0419 15:58:33.427621 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod68e9b2d5_d48c_4ad7_9698_ba61bfbdb233.slice/crio-54369359be7a3246fb8d46877346bf54cba66ae8550bc13a4bf8a02a7d15185c WatchSource:0}: Error finding container 54369359be7a3246fb8d46877346bf54cba66ae8550bc13a4bf8a02a7d15185c: Status 404 returned error can't find the container with id 54369359be7a3246fb8d46877346bf54cba66ae8550bc13a4bf8a02a7d15185c Apr 19 15:58:33.429348 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.429319 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 15:58:33.458905 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.458872 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cd8d7499f-z7gk7_e64bbe5b-551a-4ec8-845b-34a15429f076/console/0.log" Apr 19 15:58:33.492317 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:33.492236 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-jq6s6_a60042c0-48bc-4697-aa4f-73a65e8b15a5/download-server/0.log" Apr 19 15:58:34.056487 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:34.056455 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" event={"ID":"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233","Type":"ContainerStarted","Data":"4781d0da4ed52b2577b20e78e87af3c239f0ed7755b081ca58df48e08714eaec"} Apr 19 15:58:34.056487 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:34.056489 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" event={"ID":"68e9b2d5-d48c-4ad7-9698-ba61bfbdb233","Type":"ContainerStarted","Data":"54369359be7a3246fb8d46877346bf54cba66ae8550bc13a4bf8a02a7d15185c"} Apr 19 15:58:34.056753 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:34.056610 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:34.074886 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:34.074831 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" podStartSLOduration=2.074813109 podStartE2EDuration="2.074813109s" podCreationTimestamp="2026-04-19 15:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 15:58:34.072851387 +0000 UTC m=+2013.178766781" watchObservedRunningTime="2026-04-19 15:58:34.074813109 +0000 UTC m=+2013.180728498" Apr 19 15:58:34.792017 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:34.791985 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rs4pv_835a8643-4c16-4d4b-bbc6-4e4a5fa3a156/dns/0.log" Apr 19 15:58:34.825436 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:34.825408 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rs4pv_835a8643-4c16-4d4b-bbc6-4e4a5fa3a156/kube-rbac-proxy/0.log" Apr 19 15:58:34.920334 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:34.920300 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pss7s_dac7973c-ee33-410c-8f77-093953d73a03/dns-node-resolver/0.log" Apr 19 15:58:35.383423 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:35.383387 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_image-registry-548869cb46-b9zkq_18816144-4892-4c9c-8870-c79f4966f41a/registry/0.log" Apr 19 15:58:35.401463 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:35.401433 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4tfml_b3a083b4-d7b2-4f52-b323-b957d5ebc531/node-ca/0.log" Apr 19 15:58:36.206576 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:36.206548 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfwmhkc_3bfb65c0-effa-41d5-bdd1-2e41eb93cdbb/istio-proxy/0.log" Apr 19 15:58:36.389916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:36.389881 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-k8m5w_bc0715c9-b497-4e43-9ffb-a87664024408/istio-proxy/0.log" Apr 19 15:58:36.890008 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:36.889979 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2xfjp_b10af7e0-ddd6-409a-bf97-0223a35bb81a/serve-healthcheck-canary/0.log" Apr 19 15:58:37.339844 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:37.339811 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-dbkrb_0d8e4508-63ae-4c34-9e5a-88f0e8d37185/insights-operator/0.log" Apr 19 15:58:37.340266 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:37.340044 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-dbkrb_0d8e4508-63ae-4c34-9e5a-88f0e8d37185/insights-operator/1.log" Apr 19 15:58:37.419333 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:37.419305 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mf22j_8aec71fb-cb0e-441f-9f19-9346759e030b/kube-rbac-proxy/0.log" Apr 19 15:58:37.437882 
ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:37.437855 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mf22j_8aec71fb-cb0e-441f-9f19-9346759e030b/exporter/0.log" Apr 19 15:58:37.456645 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:37.456618 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mf22j_8aec71fb-cb0e-441f-9f19-9346759e030b/extractor/0.log" Apr 19 15:58:39.391343 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:39.391308 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-78fd9b446f-sz2gj_cbeb5ce2-5e31-46a3-822d-5ca343065cf0/manager/0.log" Apr 19 15:58:39.440148 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:39.440117 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-67944f454b-2kv92_9ff7b2f4-5dfd-491b-ab80-88ce950644c4/manager/0.log" Apr 19 15:58:40.071250 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:40.071224 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-w8x7s/perf-node-gather-daemonset-kfr5r" Apr 19 15:58:40.631235 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:40.631208 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-65b77f8fcd-m49lh_667a5519-f1d2-40e9-b3bc-46d649ba3525/manager/0.log" Apr 19 15:58:40.654214 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:40.654188 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-8dr4d_92319b45-1aa1-4418-9292-8eacaf99ec5f/openshift-lws-operator/0.log" Apr 19 15:58:45.367811 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:45.367775 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-k9hxh_e18d4ee3-accb-4d8b-aad0-8801d1395e00/kube-storage-version-migrator-operator/1.log" Apr 19 15:58:45.368707 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:45.368692 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-k9hxh_e18d4ee3-accb-4d8b-aad0-8801d1395e00/kube-storage-version-migrator-operator/0.log" Apr 19 15:58:46.687762 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:46.687708 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sz2ds_d6faab90-56cc-458f-bf13-4b00ae0b1686/kube-multus-additional-cni-plugins/0.log" Apr 19 15:58:46.706584 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:46.706549 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sz2ds_d6faab90-56cc-458f-bf13-4b00ae0b1686/egress-router-binary-copy/0.log" Apr 19 15:58:46.725840 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:46.725801 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sz2ds_d6faab90-56cc-458f-bf13-4b00ae0b1686/cni-plugins/0.log" Apr 19 15:58:46.745517 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:46.745494 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sz2ds_d6faab90-56cc-458f-bf13-4b00ae0b1686/bond-cni-plugin/0.log" Apr 19 15:58:46.763362 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:46.763334 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sz2ds_d6faab90-56cc-458f-bf13-4b00ae0b1686/routeoverride-cni/0.log" Apr 19 15:58:46.781326 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:46.781301 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sz2ds_d6faab90-56cc-458f-bf13-4b00ae0b1686/whereabouts-cni-bincopy/0.log" Apr 19 15:58:46.799754 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:46.799708 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sz2ds_d6faab90-56cc-458f-bf13-4b00ae0b1686/whereabouts-cni/0.log" Apr 19 15:58:46.864401 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:46.864359 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wnr7b_119caa96-ae84-4d21-8b14-6d528d9a67fd/kube-multus/0.log" Apr 19 15:58:46.929213 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:46.929181 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8cprr_41bb40b9-2854-47c5-8759-3fbea6b42b53/network-metrics-daemon/0.log" Apr 19 15:58:46.945895 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:46.945830 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8cprr_41bb40b9-2854-47c5-8759-3fbea6b42b53/kube-rbac-proxy/0.log" Apr 19 15:58:48.442027 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:48.441983 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-controller/0.log" Apr 19 15:58:48.457222 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:48.457192 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/0.log" Apr 19 15:58:48.467513 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:48.467484 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovn-acl-logging/1.log" Apr 19 15:58:48.484335 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:48.484303 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/kube-rbac-proxy-node/0.log" Apr 19 15:58:48.502116 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:48.502086 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/kube-rbac-proxy-ovn-metrics/0.log" Apr 19 15:58:48.518916 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:48.518883 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/northd/0.log" Apr 19 15:58:48.537997 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:48.537955 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/nbdb/0.log" Apr 19 15:58:48.556292 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:48.556264 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/sbdb/0.log" Apr 19 15:58:48.656853 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:48.656821 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxqlx_73514b32-300b-4466-b414-022b4c2e1f8e/ovnkube-controller/0.log" Apr 19 15:58:49.797339 ip-10-0-133-218 kubenswrapper[2579]: I0419 15:58:49.797306 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-r46tx_445f4ff9-7c10-4b4e-8d46-b2e4e449c5bc/network-check-target-container/0.log"