Apr 20 15:02:19.758743 ip-10-0-134-230 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 15:02:20.177319 ip-10-0-134-230 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 15:02:20.177319 ip-10-0-134-230 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 15:02:20.177319 ip-10-0-134-230 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 15:02:20.177319 ip-10-0-134-230 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 15:02:20.177319 ip-10-0-134-230 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 15:02:20.178802 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.178684    2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 15:02:20.181066 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181049    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 15:02:20.181066 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181066    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181069    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181073    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181076    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181079    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181082    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181085    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181088    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181091    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181094    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181097    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181100    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181103    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181106    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181110    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181114    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181117    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181122    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181125    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 15:02:20.181127 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181129    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181132    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181135    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181138    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181141    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181144    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181146    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181149    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181152    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181155    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181157    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181160    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181162    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181166    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181169    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181171    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181174    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181177    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181180    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 15:02:20.181599 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181182    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181185    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181187    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181190    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181192    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181195    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181197    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181200    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181203    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181205    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181208    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181210    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181213    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181215    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181218    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181221    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181224    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181227    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181230    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181233    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 15:02:20.182073 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181235    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181238    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181241    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181243    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181247    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181250    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181253    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181256    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181259    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181262    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181264    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181267    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181270    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181272    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181275    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181277    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181280    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181282    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181285    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181288    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 15:02:20.182584 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181290    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181293    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181295    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181298    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181300    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181303    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181305    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181721    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181727    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181730    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181733    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181735    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181738    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181740    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181743    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181746    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181748    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181751    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181754    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 15:02:20.183117 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181757    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181760    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181763    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181765    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181768    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181770    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181773    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181776    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181778    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181781    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181783    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181786    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181789    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181791    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181794    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181797    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181799    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181802    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181805    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181807    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181812    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 15:02:20.183631 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181815    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181817    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181820    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181822    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181825    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181828    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181830    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181833    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181835    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181838    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181841    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181844    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181847    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181850    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181853    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181855    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181858    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181860    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181863    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 15:02:20.184145 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181866    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181868    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181871    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181874    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181877    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181880    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181883    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181887    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181891    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181895    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181898    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181902    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181905    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181908    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181911    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181913    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181916    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181919    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 15:02:20.184701 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181922    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181924    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181927    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181930    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181932    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181935    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181938    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181940    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181943    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181946    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181949    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181951    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181954    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181956    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181959    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.181961    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182035    2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182042    2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182049    2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182053    2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182058    2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182061    2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 15:02:20.185147 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182065    2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182070    2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182073    2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182078    2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182082    2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182086    2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182089    2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182092    2572 flags.go:64] FLAG: --cgroup-root=""
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182095    2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182099    2572 flags.go:64] FLAG: --client-ca-file=""
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182102    2572 flags.go:64] FLAG: --cloud-config=""
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182104    2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182107    2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182114    2572 flags.go:64] FLAG: --cluster-domain=""
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182117    2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182121    2572 flags.go:64] FLAG: --config-dir=""
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182123    2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182127    2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182131    2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182134    2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182138    2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182141    2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182144    2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182147    2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 15:02:20.185700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182150    2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182153    2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182156    2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182160    2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182163    2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182166    2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182169    2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182172    2572 flags.go:64] FLAG: --enable-server="true"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182174    2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182182    2572 flags.go:64] FLAG: --event-burst="100"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182185    2572 flags.go:64] FLAG: --event-qps="50"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182189    2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182192    2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182195    2572 flags.go:64] FLAG: --eviction-hard=""
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182199    2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182202    2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182205    2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182208    2572 flags.go:64] FLAG: --eviction-soft=""
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182211    2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182214    2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182217    2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182220    2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420
15:02:20.182223 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182226 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182228 2572 flags.go:64] FLAG: --feature-gates="" Apr 20 15:02:20.186288 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182232 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182235 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182238 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182242 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182245 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182248 2572 flags.go:64] FLAG: --help="false" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182251 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-134-230.ec2.internal" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182255 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182257 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182260 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182264 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 15:02:20.186967 
ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182268 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182271 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182273 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182276 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182279 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182300 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182304 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182307 2572 flags.go:64] FLAG: --kube-reserved="" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182314 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182317 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182320 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182323 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182326 2572 flags.go:64] FLAG: --lock-file="" Apr 20 15:02:20.186967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182329 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182332 2572 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182335 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182340 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182342 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182346 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182349 2572 flags.go:64] FLAG: --logging-format="text" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182351 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182355 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182358 2572 flags.go:64] FLAG: --manifest-url="" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182360 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182365 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182369 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182373 2572 flags.go:64] FLAG: --max-pods="110" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182376 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182379 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 15:02:20.187592 ip-10-0-134-230 
kubenswrapper[2572]: I0420 15:02:20.182382 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182385 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182388 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182391 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182394 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182401 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182404 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182407 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 15:02:20.187592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182410 2572 flags.go:64] FLAG: --pod-cidr="" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182413 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182419 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182422 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182427 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182430 2572 flags.go:64] FLAG: --port="10250" Apr 20 
15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182433 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182436 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07a9e52bf4b075cb8" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182439 2572 flags.go:64] FLAG: --qos-reserved="" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182442 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182445 2572 flags.go:64] FLAG: --register-node="true" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182448 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182451 2572 flags.go:64] FLAG: --register-with-taints="" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182454 2572 flags.go:64] FLAG: --registry-burst="10" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182457 2572 flags.go:64] FLAG: --registry-qps="5" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182460 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182463 2572 flags.go:64] FLAG: --reserved-memory="" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182469 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182472 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182475 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182478 2572 flags.go:64] FLAG: 
--rotate-server-certificates="false" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182494 2572 flags.go:64] FLAG: --runonce="false" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182497 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182500 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182504 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 20 15:02:20.188223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182507 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182510 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182514 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182517 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182520 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182523 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182526 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182529 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182532 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182535 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 
15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182538 2572 flags.go:64] FLAG: --system-cgroups="" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182542 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182548 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182551 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182554 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182560 2572 flags.go:64] FLAG: --tls-min-version="" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182563 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182566 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182569 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182572 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182575 2572 flags.go:64] FLAG: --v="2" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182579 2572 flags.go:64] FLAG: --version="false" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182583 2572 flags.go:64] FLAG: --vmodule="" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182588 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.182591 2572 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 20 15:02:20.188869 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182698 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182702 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182706 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182709 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182719 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182723 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182726 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182728 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182732 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182734 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182737 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182740 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: 
W0420 15:02:20.182742 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182745 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182747 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182750 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182753 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182755 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182759 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182762 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 15:02:20.189474 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182765 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182768 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182770 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182773 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182778 2572 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182781 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182785 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182788 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182791 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182794 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182797 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182800 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182803 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182805 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182808 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182811 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182814 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration 
Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182817 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182820 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 15:02:20.189996 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182823 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182825 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182828 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182830 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182833 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182836 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182838 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182841 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182843 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182846 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182850 2572 feature_gate.go:349] Setting 
deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182855 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182857 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182860 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182863 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182865 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182868 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182872 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182874 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182877 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 20 15:02:20.190490 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182880 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182883 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182885 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 15:02:20.190983 ip-10-0-134-230 
kubenswrapper[2572]: W0420 15:02:20.182888 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182891 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182893 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182896 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182898 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182901 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182903 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182906 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182914 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182917 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182920 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182922 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182925 2572 feature_gate.go:328] unrecognized 
feature gate: BootcNodeManagement Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182927 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182930 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182932 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182935 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 15:02:20.190983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182937 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182940 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182942 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182947 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182949 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182952 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.182954 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.183529 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false 
MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.189806 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.189820 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189872 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189877 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189881 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189884 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189887 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 15:02:20.191541 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189891 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189895 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189897 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189900 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189903 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189906 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189909 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189911 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189915 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189918 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189920 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189923 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189926 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189928 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189931 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189933 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189936 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189938 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189941 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 15:02:20.191906 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189943 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189946 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189949 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189951 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189954 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189956 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189959 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189964 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189968 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189971 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189974 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189977 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189980 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189983 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189986 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189989 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189992 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189995 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.189998 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190001 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 15:02:20.192373 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190004 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190007 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190011 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190014 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190017 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190019 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190022 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190025 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190027 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190030 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190033 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190035 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190037 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190040 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190042 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190045 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190047 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190050 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190052 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190056 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 15:02:20.192882 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190059 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190062 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190065 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190068 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190071 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190073 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190076 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190079 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190081 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190084 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190087 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190089 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190092 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190094 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190097 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190100 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190102 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190105 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190107 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190110 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 15:02:20.193371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190113 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190115 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.190120 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190222 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190227 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190229 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190232 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190236 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190241 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190244 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190247 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190250 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190254 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190257 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190260 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 15:02:20.193923 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190263 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190266 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190268 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190271 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190273 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190276 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190279 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190281 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190284 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190287 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190289 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190292 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190294 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190297 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190301 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190304 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190307 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190309 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190312 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 15:02:20.194303 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190315 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190317 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190320 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190322 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190325 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190327 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190330 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190332 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190335 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190337 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190340 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190343 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190345 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190348 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190351 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190353 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190356 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190358 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190360 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190363 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 15:02:20.194805 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190365 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190368 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190370 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190373 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190375 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190378 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190380 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190383 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190386 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190388 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190391 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190393 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190396 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190398 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190401 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190403 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190406 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190408 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190411 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190413 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 15:02:20.195342 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190416 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190418 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190421 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190423 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190426 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190428 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190431 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190433 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190436 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190438 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190441 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190443 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190446 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190448 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:20.190451 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.190455 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 15:02:20.195912 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.191360 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 15:02:20.196304 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.193514 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 15:02:20.196304 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.194533 2572 server.go:1019] "Starting client certificate rotation"
Apr 20 15:02:20.196304 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.194625 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 15:02:20.196304 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.195301 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 15:02:20.218379 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.218355 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 15:02:20.222569 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.222545 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 15:02:20.233099 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.233080 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 20 15:02:20.238517 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.238502 2572 log.go:25] "Validated CRI v1 image API"
Apr 20 15:02:20.239779 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.239767 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 15:02:20.243760 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.243740 2572 fs.go:135] Filesystem UUIDs: map[30fbcc2d-1ae2-4b8d-b8ab-ef117f0106b5:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8861e576-a04a-4c1e-abcf-a19156495821:/dev/nvme0n1p4]
Apr 20 15:02:20.243835 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.243758 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 15:02:20.247622 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.247603 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 15:02:20.249323 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.249202 2572 manager.go:217] Machine: {Timestamp:2026-04-20 15:02:20.247393181 +0000 UTC m=+0.378268474 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102192 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29c5bea5a9eafc0f28f17b38b750cd SystemUUID:ec29c5be-a5a9-eafc-0f28-f17b38b750cd BootID:db3e32b3-54e5-481b-9ce8-6a04ce9243dc Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:84:d5:26:32:07 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:84:d5:26:32:07 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0e:54:f7:c7:8c:63 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 15:02:20.249323 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.249319 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 15:02:20.249460 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.249448 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 15:02:20.250406 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.250386 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 15:02:20.250559 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.250409 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-230.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 15:02:20.250607 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.250569 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 15:02:20.250607 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.250577 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 15:02:20.250607 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.250590 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 15:02:20.252093 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.252083 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 15:02:20.252765 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.252756 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 15:02:20.252874 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.252866 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 15:02:20.254955 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.254946 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 15:02:20.254987 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.254965 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 15:02:20.254987 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.254985 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 15:02:20.255045 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.254994 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 20 15:02:20.255045 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.255002 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 15:02:20.256132 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.256120 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 15:02:20.256172 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.256141 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 15:02:20.260261 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.260239 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 15:02:20.261703 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.261686 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 15:02:20.263502 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263477 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 15:02:20.263585 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263511 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 15:02:20.263585 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263521 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 15:02:20.263585 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263530 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 15:02:20.263585 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263539 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 15:02:20.263585 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263547 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 15:02:20.263585 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263556 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 15:02:20.263585 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263564 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 15:02:20.263585 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263574 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 15:02:20.263585 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263583 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 15:02:20.263849 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263596 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 15:02:20.263849 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.263610 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 15:02:20.264377 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.264367 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 15:02:20.264426 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.264379 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 15:02:20.267314 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.267284 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-230.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 15:02:20.267314 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.267296 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 15:02:20.267496 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.267366 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-230.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 15:02:20.267921 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.267908 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 15:02:20.267982 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.267950 2572 server.go:1295] "Started kubelet"
Apr 20 15:02:20.268051 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.268027 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 15:02:20.268123 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.268078 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 15:02:20.268169 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.268155 2572 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 15:02:20.268758 ip-10-0-134-230 systemd[1]: Started Kubernetes Kubelet.
Apr 20 15:02:20.269321 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.269307 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 15:02:20.270395 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.270383 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 15:02:20.274295 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.274274 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 15:02:20.274894 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.274872 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 15:02:20.275456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275436 2572 factory.go:55] Registering systemd factory
Apr 20 15:02:20.275456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275458 2572 factory.go:223] Registration of the systemd container factory successfully
Apr 20 15:02:20.275631 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275535 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 15:02:20.275631 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275553 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 15:02:20.275631 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275562 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 15:02:20.275631 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275631 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 15:02:20.275808 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275639 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 15:02:20.275808 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.275726 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:20.275808 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275745 2572 factory.go:153] Registering CRI-O factory
Apr 20 15:02:20.275808 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275761 2572 factory.go:223] Registration of the crio container factory successfully
Apr 20 15:02:20.276006 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275818 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 15:02:20.276006 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275843 2572 factory.go:103] Registering Raw factory
Apr 20 15:02:20.276006 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.275859 2572 manager.go:1196] Started watching for new ooms in manager
Apr 20 15:02:20.276205 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.276186 2572 manager.go:319] Starting recovery of all containers
Apr 20 15:02:20.286924 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.286888 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 15:02:20.287235 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.287201 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 15:02:20.287962 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.287939 2572 manager.go:324] Recovery completed
Apr 20 15:02:20.288336 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.287308 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-230.ec2.internal.18a818d1500d0109 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-230.ec2.internal,UID:ip-10-0-134-230.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-230.ec2.internal,},FirstTimestamp:2026-04-20 15:02:20.267921673 +0000 UTC m=+0.398796972,LastTimestamp:2026-04-20 15:02:20.267921673 +0000 UTC m=+0.398796972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-230.ec2.internal,}"
Apr 20 15:02:20.288336 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.288319 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-230.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 15:02:20.293561 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.293549 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 15:02:20.295111 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.295092 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5snzc"
Apr 20 15:02:20.296259 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.296244 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 15:02:20.296334 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.296277 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 15:02:20.296334 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.296307 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientPID"
Apr 20 15:02:20.296811 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.296792 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 15:02:20.296811 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.296810 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 15:02:20.296921 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.296827 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 15:02:20.298731 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.298719 2572 policy_none.go:49] "None policy: Start"
Apr 20 15:02:20.298765 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.298736 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 15:02:20.298765 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.298746 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 15:02:20.303024 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.303004 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5snzc"
Apr 20 15:02:20.304653 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.304577 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-230.ec2.internal.18a818d151bd6710 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-230.ec2.internal,UID:ip-10-0-134-230.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-230.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-230.ec2.internal,},FirstTimestamp:2026-04-20 15:02:20.296259344 +0000 UTC m=+0.427134646,LastTimestamp:2026-04-20 15:02:20.296259344 +0000 UTC m=+0.427134646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-230.ec2.internal,}"
Apr 20 15:02:20.347049 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.347032 2572 manager.go:341] "Starting Device Plugin manager"
Apr 20 15:02:20.348119 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.347063 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 15:02:20.348119 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.347073 2572 server.go:85] "Starting device plugin registration server"
Apr 20 15:02:20.348119 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.347295 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 15:02:20.348119 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.347305 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 15:02:20.348119 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.347401 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 15:02:20.348119 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.347509 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 15:02:20.348119 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.347516 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 15:02:20.348119 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.347975 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 15:02:20.348119 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.348012 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:20.404713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.404683 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 15:02:20.404713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.404719 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 15:02:20.404914 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.404744 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 15:02:20.404914 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.404759 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 15:02:20.404914 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.404864 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 15:02:20.407199 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.407180 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 15:02:20.447907 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.447847 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 15:02:20.448657 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.448640 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 15:02:20.448757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.448673 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 15:02:20.448757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.448689 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientPID"
Apr 20 15:02:20.448757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.448718 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.456931 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.456917 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.456994 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.456939 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-230.ec2.internal\": node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:20.472109 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.472088 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:20.504982 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.504960 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-230.ec2.internal"]
Apr 20 15:02:20.505038 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.505022 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 15:02:20.506338 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.506322 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 15:02:20.506407 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.506348 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 15:02:20.506407 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.506359 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientPID"
Apr 20 15:02:20.507470 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.507458 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 15:02:20.507625 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.507611 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.507667 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.507642 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 15:02:20.508102 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.508084 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 15:02:20.508171 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.508112 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 15:02:20.508171 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.508123 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientPID"
Apr 20 15:02:20.508171 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.508090 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 15:02:20.508277 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.508184 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 15:02:20.508277 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.508198 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientPID"
Apr 20 15:02:20.509217 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.509204 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.509261 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.509227 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 15:02:20.509871 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.509857 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 15:02:20.509931 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.509881 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 15:02:20.509931 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.509891 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeHasSufficientPID"
Apr 20 15:02:20.535026 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.535007 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-230.ec2.internal\" not found" node="ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.539305 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.539289 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-230.ec2.internal\" not found" node="ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.572408 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.572390 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:20.672884 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.672860 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:20.677156 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.677141 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/22ceeb306285ffc0c74b817e3784b6a3-config\") pod \"kube-apiserver-proxy-ip-10-0-134-230.ec2.internal\" (UID: \"22ceeb306285ffc0c74b817e3784b6a3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.677210 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.677163 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00a4ac28b2d170e2b762126f5342671d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal\" (UID: \"00a4ac28b2d170e2b762126f5342671d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.677210 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.677180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00a4ac28b2d170e2b762126f5342671d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal\" (UID: \"00a4ac28b2d170e2b762126f5342671d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.773574 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.773513 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:20.777875 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.777854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00a4ac28b2d170e2b762126f5342671d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal\" (UID: \"00a4ac28b2d170e2b762126f5342671d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.777926 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.777886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00a4ac28b2d170e2b762126f5342671d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal\" (UID: \"00a4ac28b2d170e2b762126f5342671d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.777926 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.777903 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/22ceeb306285ffc0c74b817e3784b6a3-config\") pod \"kube-apiserver-proxy-ip-10-0-134-230.ec2.internal\" (UID: \"22ceeb306285ffc0c74b817e3784b6a3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.777985 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.777939 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00a4ac28b2d170e2b762126f5342671d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal\" (UID: \"00a4ac28b2d170e2b762126f5342671d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.777985 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.777958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00a4ac28b2d170e2b762126f5342671d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal\" (UID: \"00a4ac28b2d170e2b762126f5342671d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.778045 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.777944 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/22ceeb306285ffc0c74b817e3784b6a3-config\") pod \"kube-apiserver-proxy-ip-10-0-134-230.ec2.internal\" (UID: \"22ceeb306285ffc0c74b817e3784b6a3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.837005 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.836979 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.841281 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:20.841264 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-230.ec2.internal"
Apr 20 15:02:20.874224 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.874200 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:20.974717 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:20.974682 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:21.075319 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:21.075237 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:21.079353 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.079334 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 15:02:21.176391 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:21.176358 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:21.194634 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.194619 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 15:02:21.195183 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.194747 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 15:02:21.195183 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.194785 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 15:02:21.275221 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.275082 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 15:02:21.276550 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:21.276528 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:21.289447 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.289427 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 15:02:21.305228 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.305198 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:57:20 +0000 UTC" deadline="2028-01-07 01:32:07.287346768 +0000 UTC"
Apr 20 15:02:21.305228 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.305228 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15034h29m45.982124758s"
Apr 20 15:02:21.307275 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.307257 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rhps5"
Apr 20 15:02:21.316201 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.316183 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rhps5"
Apr 20 15:02:21.337841 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:21.337807 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ceeb306285ffc0c74b817e3784b6a3.slice/crio-9e8f2c6e7100177986d7b425a22af3e857551ffb0378fa456c3831339438c3e6 WatchSource:0}: Error finding container 9e8f2c6e7100177986d7b425a22af3e857551ffb0378fa456c3831339438c3e6: Status 404 returned error can't find the container with id 9e8f2c6e7100177986d7b425a22af3e857551ffb0378fa456c3831339438c3e6
Apr 20 15:02:21.343040 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.343027 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:02:21.353302 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.353284 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 15:02:21.377418 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:21.377395 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found"
Apr 20 15:02:21.407263 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.407214 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-230.ec2.internal" event={"ID":"22ceeb306285ffc0c74b817e3784b6a3","Type":"ContainerStarted","Data":"9e8f2c6e7100177986d7b425a22af3e857551ffb0378fa456c3831339438c3e6"}
Apr 20 15:02:21.408044 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.408020 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal" event={"ID":"00a4ac28b2d170e2b762126f5342671d","Type":"ContainerStarted","Data":"5274d905e9422cda37c82a4c17385a2d5db11d23295abca03ab152c72b3a3de0"} Apr 20 15:02:21.478423 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:21.478403 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found" Apr 20 15:02:21.578873 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:21.578853 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-230.ec2.internal\" not found" Apr 20 15:02:21.602789 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.602730 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:21.675653 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.675625 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal" Apr 20 15:02:21.689036 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.689013 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 15:02:21.689820 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.689806 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-230.ec2.internal" Apr 20 15:02:21.695965 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:21.695943 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 15:02:22.128092 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.128063 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:22.256972 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.256937 2572 apiserver.go:52] "Watching apiserver" Apr 20 15:02:22.264359 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.264175 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 15:02:22.266085 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.266001 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-x8tdg","openshift-network-operator/iptables-alerter-5nnpq","openshift-ovn-kubernetes/ovnkube-node-nhq74","kube-system/konnectivity-agent-qh2tf","kube-system/kube-apiserver-proxy-ip-10-0-134-230.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal","openshift-multus/multus-additional-cni-plugins-v4jvg","openshift-multus/multus-b568r","openshift-multus/network-metrics-daemon-sq52t","openshift-network-diagnostics/network-check-target-5f8cl","openshift-cluster-node-tuning-operator/tuned-m6lqv"] Apr 20 15:02:22.269275 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.269252 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x8tdg" Apr 20 15:02:22.272562 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.271684 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.272562 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.271738 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 15:02:22.272562 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.271991 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-drcvv\"" Apr 20 15:02:22.272562 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.272184 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 15:02:22.272562 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.272338 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 15:02:22.275171 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.274731 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 15:02:22.275171 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.274988 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 15:02:22.275171 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.275013 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 15:02:22.275171 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.275160 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-cg2g6\"" Apr 20 15:02:22.276062 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.275748 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 15:02:22.276062 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.275781 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 15:02:22.276062 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.275810 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5nnpq" Apr 20 15:02:22.276062 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.275756 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 15:02:22.276900 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.276860 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qh2tf" Apr 20 15:02:22.278089 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.278067 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 15:02:22.278455 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.278288 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:02:22.278455 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.278292 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fgj9r\"" Apr 20 15:02:22.278684 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.278468 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" Apr 20 15:02:22.280422 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.280154 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 15:02:22.280422 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.280264 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v4jvg" Apr 20 15:02:22.280958 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.280926 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 15:02:22.281027 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.280992 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cb8qj\"" Apr 20 15:02:22.281213 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.281197 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 15:02:22.282977 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.282958 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xrl2v\"" Apr 20 15:02:22.283215 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.283195 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 15:02:22.283874 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.283781 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 15:02:22.284064 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.284045 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 15:02:22.284314 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.284298 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 15:02:22.284618 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.284600 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 15:02:22.285757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.285375 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 15:02:22.285757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.285555 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 15:02:22.285757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.285703 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 15:02:22.285977 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.285832 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jbg2c\"" Apr 20 15:02:22.286162 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286110 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/772b645d-af27-49ac-9efa-a2cf5ea2725a-ovnkube-config\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.286162 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286139 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69nr6\" (UniqueName: \"kubernetes.io/projected/772b645d-af27-49ac-9efa-a2cf5ea2725a-kube-api-access-69nr6\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.286162 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286157 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c683848d-ce2a-4a44-8396-37b7a8863b07-host-slash\") pod \"iptables-alerter-5nnpq\" (UID: \"c683848d-ce2a-4a44-8396-37b7a8863b07\") " pod="openshift-network-operator/iptables-alerter-5nnpq" Apr 20 15:02:22.286333 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286186 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-slash\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.286333 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.286333 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286294 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cb13e155-4ffa-49d2-8c49-c34c374a7d61-konnectivity-ca\") pod \"konnectivity-agent-qh2tf\" (UID: \"cb13e155-4ffa-49d2-8c49-c34c374a7d61\") " 
pod="kube-system/konnectivity-agent-qh2tf" Apr 20 15:02:22.286333 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286321 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" Apr 20 15:02:22.286544 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286382 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-etc-selinux\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" Apr 20 15:02:22.286637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286540 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-var-lib-openvswitch\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.286707 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286635 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-run-ovn\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.286707 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286650 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-b568r" Apr 20 15:02:22.286707 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfpfk\" (UniqueName: \"kubernetes.io/projected/5907ebec-df4b-4653-b568-7e4913dcec73-kube-api-access-pfpfk\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" Apr 20 15:02:22.286854 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-kubelet\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.286854 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286745 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-run-netns\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.286854 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-etc-openvswitch\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287012 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286866 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/772b645d-af27-49ac-9efa-a2cf5ea2725a-env-overrides\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287012 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/772b645d-af27-49ac-9efa-a2cf5ea2725a-ovnkube-script-lib\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287012 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-registration-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" Apr 20 15:02:22.287012 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.286962 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22757559-941f-4d9e-9128-3aeefc6665f3-serviceca\") pod \"node-ca-x8tdg\" (UID: \"22757559-941f-4d9e-9128-3aeefc6665f3\") " pod="openshift-image-registry/node-ca-x8tdg" Apr 20 15:02:22.287212 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287025 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h7mx\" (UniqueName: \"kubernetes.io/projected/c683848d-ce2a-4a44-8396-37b7a8863b07-kube-api-access-7h7mx\") pod \"iptables-alerter-5nnpq\" (UID: \"c683848d-ce2a-4a44-8396-37b7a8863b07\") " pod="openshift-network-operator/iptables-alerter-5nnpq" Apr 20 15:02:22.287212 ip-10-0-134-230 
kubenswrapper[2572]: I0420 15:02:22.287086 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-run-systemd\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287212 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287117 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-run-openvswitch\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287212 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287172 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-socket-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" Apr 20 15:02:22.287212 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287203 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22757559-941f-4d9e-9128-3aeefc6665f3-host\") pod \"node-ca-x8tdg\" (UID: \"22757559-941f-4d9e-9128-3aeefc6665f3\") " pod="openshift-image-registry/node-ca-x8tdg" Apr 20 15:02:22.287474 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-cni-bin\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287474 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287341 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287474 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287397 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/772b645d-af27-49ac-9efa-a2cf5ea2725a-ovn-node-metrics-cert\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287474 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287428 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf7vt\" (UniqueName: \"kubernetes.io/projected/22757559-941f-4d9e-9128-3aeefc6665f3-kube-api-access-vf7vt\") pod \"node-ca-x8tdg\" (UID: \"22757559-941f-4d9e-9128-3aeefc6665f3\") " pod="openshift-image-registry/node-ca-x8tdg" Apr 20 15:02:22.287819 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c683848d-ce2a-4a44-8396-37b7a8863b07-iptables-alerter-script\") pod \"iptables-alerter-5nnpq\" (UID: \"c683848d-ce2a-4a44-8396-37b7a8863b07\") " pod="openshift-network-operator/iptables-alerter-5nnpq" Apr 20 15:02:22.287819 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287571 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-log-socket\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287819 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287605 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-cni-netd\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287819 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cb13e155-4ffa-49d2-8c49-c34c374a7d61-agent-certs\") pod \"konnectivity-agent-qh2tf\" (UID: \"cb13e155-4ffa-49d2-8c49-c34c374a7d61\") " pod="kube-system/konnectivity-agent-qh2tf" Apr 20 15:02:22.287819 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-systemd-units\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287819 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-node-log\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.287819 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287733 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-device-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.287819 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.287766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-sys-fs\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.289195 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.289102 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-z2vjb\""
Apr 20 15:02:22.289195 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.289128 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:02:22.289332 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.289240 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 15:02:22.289647 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.289432 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl"
Apr 20 15:02:22.289647 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:22.289497 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3"
Apr 20 15:02:22.289647 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:22.289317 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff"
Apr 20 15:02:22.290614 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.290575 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.292623 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.292602 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 15:02:22.292754 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.292696 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 15:02:22.292935 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.292915 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tfhgl\""
Apr 20 15:02:22.316859 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.316836 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:57:21 +0000 UTC" deadline="2027-09-29 18:47:35.053729915 +0000 UTC"
Apr 20 15:02:22.316859 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.316858 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12651h45m12.736874624s"
Apr 20 15:02:22.376370 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.376345 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 15:02:22.388232 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-slash\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.388232 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.388390 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cb13e155-4ffa-49d2-8c49-c34c374a7d61-konnectivity-ca\") pod \"konnectivity-agent-qh2tf\" (UID: \"cb13e155-4ffa-49d2-8c49-c34c374a7d61\") " pod="kube-system/konnectivity-agent-qh2tf"
Apr 20 15:02:22.388390 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-kubernetes\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.388390 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388283 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-tuned\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.388390 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-slash\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.388390 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388293 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.388390 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388332 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1f117f6f-4e31-4bc5-91d0-9a6176af628e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.388390 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388367 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-var-lib-openvswitch\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-run-ovn\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfpfk\" (UniqueName: \"kubernetes.io/projected/5907ebec-df4b-4653-b568-7e4913dcec73-kube-api-access-pfpfk\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-var-lib-openvswitch\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388447 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-sysctl-d\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388472 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wdxz\" (UniqueName: \"kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz\") pod \"network-check-target-5f8cl\" (UID: \"31c75c33-7396-4dd6-8404-bd0e038f65b3\") " pod="openshift-network-diagnostics/network-check-target-5f8cl"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-run-ovn\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388531 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-cnibin\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388583 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1f117f6f-4e31-4bc5-91d0-9a6176af628e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388638 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttd7h\" (UniqueName: \"kubernetes.io/projected/1f117f6f-4e31-4bc5-91d0-9a6176af628e-kube-api-access-ttd7h\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388662 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-cnibin\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-cni-binary-copy\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388721 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-etc-kubernetes\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388751 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-socket-dir-parent\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388776 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-run-netns\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.388813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-var-lib-kubelet\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.389456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388835 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-kubelet\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.389456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388853 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cb13e155-4ffa-49d2-8c49-c34c374a7d61-konnectivity-ca\") pod \"konnectivity-agent-qh2tf\" (UID: \"cb13e155-4ffa-49d2-8c49-c34c374a7d61\") " pod="kube-system/konnectivity-agent-qh2tf"
Apr 20 15:02:22.389456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388862 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/772b645d-af27-49ac-9efa-a2cf5ea2725a-env-overrides\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.389456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-kubelet\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.389456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7h7mx\" (UniqueName: \"kubernetes.io/projected/c683848d-ce2a-4a44-8396-37b7a8863b07-kube-api-access-7h7mx\") pod \"iptables-alerter-5nnpq\" (UID: \"c683848d-ce2a-4a44-8396-37b7a8863b07\") " pod="openshift-network-operator/iptables-alerter-5nnpq"
Apr 20 15:02:22.389456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-socket-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.389456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388974 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/772b645d-af27-49ac-9efa-a2cf5ea2725a-ovnkube-script-lib\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.389456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.388998 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-registration-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.389456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.389329 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-registration-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.389854 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.389513 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-socket-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.389854 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.389667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22757559-941f-4d9e-9128-3aeefc6665f3-serviceca\") pod \"node-ca-x8tdg\" (UID: \"22757559-941f-4d9e-9128-3aeefc6665f3\") " pod="openshift-image-registry/node-ca-x8tdg"
Apr 20 15:02:22.389854 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.389717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-cni-dir\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.389854 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.389828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-systemd\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.390027 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.389863 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-sys\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.390027 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.389910 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.390027 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.389961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/772b645d-af27-49ac-9efa-a2cf5ea2725a-ovn-node-metrics-cert\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.390027 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.389993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-host\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.390204 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-system-cni-dir\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.390204 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-log-socket\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.390204 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390124 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22757559-941f-4d9e-9128-3aeefc6665f3-serviceca\") pod \"node-ca-x8tdg\" (UID: \"22757559-941f-4d9e-9128-3aeefc6665f3\") " pod="openshift-image-registry/node-ca-x8tdg"
Apr 20 15:02:22.390204 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cb13e155-4ffa-49d2-8c49-c34c374a7d61-agent-certs\") pod \"konnectivity-agent-qh2tf\" (UID: \"cb13e155-4ffa-49d2-8c49-c34c374a7d61\") " pod="kube-system/konnectivity-agent-qh2tf"
Apr 20 15:02:22.390204 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.389710 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/772b645d-af27-49ac-9efa-a2cf5ea2725a-env-overrides\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.390204 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390198 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-var-lib-cni-bin\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.390462 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390257 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-sysconfig\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.390462 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390272 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-log-socket\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.390462 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-sys-fs\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.390462 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390343 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-conf-dir\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.390462 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390371 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/772b645d-af27-49ac-9efa-a2cf5ea2725a-ovnkube-script-lib\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.390462 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-sys-fs\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.390462 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390157 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.390462 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390498 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f117f6f-4e31-4bc5-91d0-9a6176af628e-cni-binary-copy\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390475 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/772b645d-af27-49ac-9efa-a2cf5ea2725a-ovnkube-config\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390616 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69nr6\" (UniqueName: \"kubernetes.io/projected/772b645d-af27-49ac-9efa-a2cf5ea2725a-kube-api-access-69nr6\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390647 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-hostroot\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390673 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-daemon-config\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-var-lib-kubelet\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c683848d-ce2a-4a44-8396-37b7a8863b07-host-slash\") pod \"iptables-alerter-5nnpq\" (UID: \"c683848d-ce2a-4a44-8396-37b7a8863b07\") " pod="openshift-network-operator/iptables-alerter-5nnpq"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-etc-selinux\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.390852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2gwn\" (UniqueName: \"kubernetes.io/projected/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-kube-api-access-h2gwn\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpgwb\" (UniqueName: \"kubernetes.io/projected/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-kube-api-access-mpgwb\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390903 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-os-release\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-run-netns\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.390968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-etc-openvswitch\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391000 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-modprobe-d\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391023 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-lib-modules\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-run-systemd\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-run-openvswitch\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391097 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/772b645d-af27-49ac-9efa-a2cf5ea2725a-ovnkube-config\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391112 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22757559-941f-4d9e-9128-3aeefc6665f3-host\") pod \"node-ca-x8tdg\" (UID: \"22757559-941f-4d9e-9128-3aeefc6665f3\") " pod="openshift-image-registry/node-ca-x8tdg"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-cni-bin\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391199 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-cni-bin\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-run-netns\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.391350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391337 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-etc-openvswitch\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-run-systemd\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391469 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22757559-941f-4d9e-9128-3aeefc6665f3-host\") pod \"node-ca-x8tdg\" (UID: \"22757559-941f-4d9e-9128-3aeefc6665f3\") " pod="openshift-image-registry/node-ca-x8tdg"
Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391519 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391526 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-run-openvswitch\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c683848d-ce2a-4a44-8396-37b7a8863b07-host-slash\") pod \"iptables-alerter-5nnpq\" (UID: \"c683848d-ce2a-4a44-8396-37b7a8863b07\") " pod="openshift-network-operator/iptables-alerter-5nnpq"
Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391615 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-etc-selinux\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391652 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vf7vt\" (UniqueName: \"kubernetes.io/projected/22757559-941f-4d9e-9128-3aeefc6665f3-kube-api-access-vf7vt\") pod \"node-ca-x8tdg\" (UID: \"22757559-941f-4d9e-9128-3aeefc6665f3\") " pod="openshift-image-registry/node-ca-x8tdg"
Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391685 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-run\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391708 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5844k\" (UniqueName: \"kubernetes.io/projected/8345e9c1-deff-477f-ba7b-f5320279bda9-kube-api-access-5844k\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c683848d-ce2a-4a44-8396-37b7a8863b07-iptables-alerter-script\") pod \"iptables-alerter-5nnpq\" (UID: \"c683848d-ce2a-4a44-8396-37b7a8863b07\") " 
pod="openshift-network-operator/iptables-alerter-5nnpq" Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-cni-netd\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-system-cni-dir\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-run-k8s-cni-cncf-io\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391855 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-var-lib-cni-multus\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391916 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-host-cni-netd\") pod \"ovnkube-node-nhq74\" (UID: 
\"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.391952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391956 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-run-multus-certs\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.392637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.391993 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-systemd-units\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.392637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.392042 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-node-log\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.392637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.392073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-device-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" Apr 20 15:02:22.392637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.392102 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8345e9c1-deff-477f-ba7b-f5320279bda9-tmp\") 
pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.392637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.392177 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5907ebec-df4b-4653-b568-7e4913dcec73-device-dir\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" Apr 20 15:02:22.392637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.392214 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-sysctl-conf\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.392637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.392245 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-os-release\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg" Apr 20 15:02:22.392637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.392304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c683848d-ce2a-4a44-8396-37b7a8863b07-iptables-alerter-script\") pod \"iptables-alerter-5nnpq\" (UID: \"c683848d-ce2a-4a44-8396-37b7a8863b07\") " pod="openshift-network-operator/iptables-alerter-5nnpq" Apr 20 15:02:22.392637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.392362 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-systemd-units\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.392637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.392391 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/772b645d-af27-49ac-9efa-a2cf5ea2725a-node-log\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.394563 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.394539 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/772b645d-af27-49ac-9efa-a2cf5ea2725a-ovn-node-metrics-cert\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.394648 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.394584 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cb13e155-4ffa-49d2-8c49-c34c374a7d61-agent-certs\") pod \"konnectivity-agent-qh2tf\" (UID: \"cb13e155-4ffa-49d2-8c49-c34c374a7d61\") " pod="kube-system/konnectivity-agent-qh2tf" Apr 20 15:02:22.401234 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.401209 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69nr6\" (UniqueName: \"kubernetes.io/projected/772b645d-af27-49ac-9efa-a2cf5ea2725a-kube-api-access-69nr6\") pod \"ovnkube-node-nhq74\" (UID: \"772b645d-af27-49ac-9efa-a2cf5ea2725a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:22.401335 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.401287 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vf7vt\" (UniqueName: \"kubernetes.io/projected/22757559-941f-4d9e-9128-3aeefc6665f3-kube-api-access-vf7vt\") pod \"node-ca-x8tdg\" (UID: \"22757559-941f-4d9e-9128-3aeefc6665f3\") " pod="openshift-image-registry/node-ca-x8tdg" Apr 20 15:02:22.401544 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.401523 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfpfk\" (UniqueName: \"kubernetes.io/projected/5907ebec-df4b-4653-b568-7e4913dcec73-kube-api-access-pfpfk\") pod \"aws-ebs-csi-driver-node-xt47l\" (UID: \"5907ebec-df4b-4653-b568-7e4913dcec73\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" Apr 20 15:02:22.402062 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.401919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h7mx\" (UniqueName: \"kubernetes.io/projected/c683848d-ce2a-4a44-8396-37b7a8863b07-kube-api-access-7h7mx\") pod \"iptables-alerter-5nnpq\" (UID: \"c683848d-ce2a-4a44-8396-37b7a8863b07\") " pod="openshift-network-operator/iptables-alerter-5nnpq" Apr 20 15:02:22.492670 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-modprobe-d\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.492833 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-lib-modules\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.492833 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492709 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-run\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.492833 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492736 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5844k\" (UniqueName: \"kubernetes.io/projected/8345e9c1-deff-477f-ba7b-f5320279bda9-kube-api-access-5844k\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.492833 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492763 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-system-cni-dir\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.492833 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-run-k8s-cni-cncf-io\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.492833 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-var-lib-cni-multus\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492827 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-run\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-run-multus-certs\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-system-cni-dir\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8345e9c1-deff-477f-ba7b-f5320279bda9-tmp\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492864 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-run-k8s-cni-cncf-io\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492827 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-modprobe-d\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-run-multus-certs\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492903 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-var-lib-cni-multus\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-lib-modules\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-sysctl-conf\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.492981 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-os-release\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-kubernetes\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493032 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-sysctl-conf\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-tuned\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1f117f6f-4e31-4bc5-91d0-9a6176af628e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493091 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-os-release\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg" Apr 20 15:02:22.493124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-kubernetes\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-sysctl-d\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493184 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wdxz\" (UniqueName: \"kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz\") pod \"network-check-target-5f8cl\" (UID: \"31c75c33-7396-4dd6-8404-bd0e038f65b3\") " pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493221 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-cnibin\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg" Apr 20 15:02:22.493891 ip-10-0-134-230 
kubenswrapper[2572]: I0420 15:02:22.493252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1f117f6f-4e31-4bc5-91d0-9a6176af628e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttd7h\" (UniqueName: \"kubernetes.io/projected/1f117f6f-4e31-4bc5-91d0-9a6176af628e-kube-api-access-ttd7h\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-sysctl-d\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-cnibin\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-cni-binary-copy\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " 
pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493349 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-etc-kubernetes\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493374 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-socket-dir-parent\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-run-netns\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-var-lib-kubelet\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-cni-dir\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 
15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493524 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-etc-kubernetes\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493543 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-socket-dir-parent\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-cnibin\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493636 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-run-netns\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.493891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493673 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1f117f6f-4e31-4bc5-91d0-9a6176af628e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg" Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 
15:02:22.493677 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-var-lib-kubelet\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493714 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-cnibin\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg" Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493720 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-systemd\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-cni-dir\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r" Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-sys\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493785 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-sys\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-host\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493809 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-systemd\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-system-cni-dir\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493813 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1f117f6f-4e31-4bc5-91d0-9a6176af628e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493832 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-host\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-var-lib-cni-bin\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493895 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-sysconfig\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-host-var-lib-cni-bin\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493869 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-system-cni-dir\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493923 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-conf-dir\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:02:22.494810 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f117f6f-4e31-4bc5-91d0-9a6176af628e-cni-binary-copy\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-conf-dir\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.493966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-sysconfig\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494027 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-cni-binary-copy\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:22.494033 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:22.494175 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs podName:7215f3fe-093a-42b9-bea0-26a93cb4e1ff nodeName:}" failed. No retries permitted until 2026-04-20 15:02:22.994132411 +0000 UTC m=+3.125007706 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs") pod "network-metrics-daemon-sq52t" (UID: "7215f3fe-093a-42b9-bea0-26a93cb4e1ff") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f117f6f-4e31-4bc5-91d0-9a6176af628e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-hostroot\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-daemon-config\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494267 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-var-lib-kubelet\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2gwn\" (UniqueName: \"kubernetes.io/projected/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-kube-api-access-h2gwn\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494323 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpgwb\" (UniqueName: \"kubernetes.io/projected/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-kube-api-access-mpgwb\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8345e9c1-deff-477f-ba7b-f5320279bda9-var-lib-kubelet\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494268 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-hostroot\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494346 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-os-release\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f117f6f-4e31-4bc5-91d0-9a6176af628e-cni-binary-copy\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.495750 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494533 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-os-release\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.496518 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.494745 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-multus-daemon-config\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.496518 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.495287 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8345e9c1-deff-477f-ba7b-f5320279bda9-tmp\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.496518 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.495512 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8345e9c1-deff-477f-ba7b-f5320279bda9-etc-tuned\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.504362 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:22.504338 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 15:02:22.504362 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:22.504362 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 15:02:22.504634 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:22.504376 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8wdxz for pod openshift-network-diagnostics/network-check-target-5f8cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 15:02:22.504634 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:22.504506 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz podName:31c75c33-7396-4dd6-8404-bd0e038f65b3 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:23.004471698 +0000 UTC m=+3.135347029 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-8wdxz" (UniqueName: "kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz") pod "network-check-target-5f8cl" (UID: "31c75c33-7396-4dd6-8404-bd0e038f65b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 15:02:22.505552 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.505528 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttd7h\" (UniqueName: \"kubernetes.io/projected/1f117f6f-4e31-4bc5-91d0-9a6176af628e-kube-api-access-ttd7h\") pod \"multus-additional-cni-plugins-v4jvg\" (UID: \"1f117f6f-4e31-4bc5-91d0-9a6176af628e\") " pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.506332 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.506310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpgwb\" (UniqueName: \"kubernetes.io/projected/de3eeafd-2ce5-4f51-9232-ac55f91bb7af-kube-api-access-mpgwb\") pod \"multus-b568r\" (UID: \"de3eeafd-2ce5-4f51-9232-ac55f91bb7af\") " pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.506618 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.506594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5844k\" (UniqueName: \"kubernetes.io/projected/8345e9c1-deff-477f-ba7b-f5320279bda9-kube-api-access-5844k\") pod \"tuned-m6lqv\" (UID: \"8345e9c1-deff-477f-ba7b-f5320279bda9\") " pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.506846 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.506827 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2gwn\" (UniqueName: \"kubernetes.io/projected/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-kube-api-access-h2gwn\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:02:22.585996 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.585955 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x8tdg"
Apr 20 15:02:22.594975 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.594951 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74"
Apr 20 15:02:22.604363 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.604343 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5nnpq"
Apr 20 15:02:22.611951 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.611931 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qh2tf"
Apr 20 15:02:22.618599 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.618582 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l"
Apr 20 15:02:22.625180 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.625164 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v4jvg"
Apr 20 15:02:22.631753 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.631734 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b568r"
Apr 20 15:02:22.636348 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.636326 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-m6lqv"
Apr 20 15:02:22.960808 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:22.960776 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod772b645d_af27_49ac_9efa_a2cf5ea2725a.slice/crio-aeae1fb9ef7c391bb3105cc206c3732ee95fadc7a63b667fcd46ca2b3a24e08a WatchSource:0}: Error finding container aeae1fb9ef7c391bb3105cc206c3732ee95fadc7a63b667fcd46ca2b3a24e08a: Status 404 returned error can't find the container with id aeae1fb9ef7c391bb3105cc206c3732ee95fadc7a63b667fcd46ca2b3a24e08a
Apr 20 15:02:22.962951 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:22.962879 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc683848d_ce2a_4a44_8396_37b7a8863b07.slice/crio-3155d55ea309c4b4ee53cc3e3c0b03ec8ca279950bad7844df5dd10cc0ee7f1f WatchSource:0}: Error finding container 3155d55ea309c4b4ee53cc3e3c0b03ec8ca279950bad7844df5dd10cc0ee7f1f: Status 404 returned error can't find the container with id 3155d55ea309c4b4ee53cc3e3c0b03ec8ca279950bad7844df5dd10cc0ee7f1f
Apr 20 15:02:22.964549 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:22.964527 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5907ebec_df4b_4653_b568_7e4913dcec73.slice/crio-3c950cd6e8de94d73a73bd22f23df09235aad79b3708106c1504c4373342690d WatchSource:0}: Error finding container 3c950cd6e8de94d73a73bd22f23df09235aad79b3708106c1504c4373342690d: Status 404 returned error can't find the container with id 3c950cd6e8de94d73a73bd22f23df09235aad79b3708106c1504c4373342690d
Apr 20 15:02:22.966257 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:22.966228 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22757559_941f_4d9e_9128_3aeefc6665f3.slice/crio-6c1ce787d8facf2ce8b83f0c8ecf0a6bda96de36d46f861f5df835d284616a1c WatchSource:0}: Error finding container 6c1ce787d8facf2ce8b83f0c8ecf0a6bda96de36d46f861f5df835d284616a1c: Status 404 returned error can't find the container with id 6c1ce787d8facf2ce8b83f0c8ecf0a6bda96de36d46f861f5df835d284616a1c
Apr 20 15:02:22.967118 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:22.967092 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8345e9c1_deff_477f_ba7b_f5320279bda9.slice/crio-dc955dc049362a62163277fa1615ba30bd5cce6be6a550a995f26f76170632fb WatchSource:0}: Error finding container dc955dc049362a62163277fa1615ba30bd5cce6be6a550a995f26f76170632fb: Status 404 returned error can't find the container with id dc955dc049362a62163277fa1615ba30bd5cce6be6a550a995f26f76170632fb
Apr 20 15:02:22.968454 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:22.968428 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde3eeafd_2ce5_4f51_9232_ac55f91bb7af.slice/crio-4f38838b42d2b2d8d7717db1c95c30734a08c5d00e0b4c1bc0635b7989a70a01 WatchSource:0}: Error finding container 4f38838b42d2b2d8d7717db1c95c30734a08c5d00e0b4c1bc0635b7989a70a01: Status 404 returned error can't find the container with id 4f38838b42d2b2d8d7717db1c95c30734a08c5d00e0b4c1bc0635b7989a70a01
Apr 20 15:02:22.970154 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:22.970131 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f117f6f_4e31_4bc5_91d0_9a6176af628e.slice/crio-8e76982ecbbb9bb49e89eb46d94d8eea74921f989ad2772425f9c9954b6e3ea7 WatchSource:0}: Error finding container 8e76982ecbbb9bb49e89eb46d94d8eea74921f989ad2772425f9c9954b6e3ea7: Status 404 returned error can't find the container with id 8e76982ecbbb9bb49e89eb46d94d8eea74921f989ad2772425f9c9954b6e3ea7
Apr 20 15:02:22.971056 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:22.971029 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb13e155_4ffa_49d2_8c49_c34c374a7d61.slice/crio-baf4b744b18206e7983cd0ae225d0d91682e49613ccc230cbd3417608a4ece23 WatchSource:0}: Error finding container baf4b744b18206e7983cd0ae225d0d91682e49613ccc230cbd3417608a4ece23: Status 404 returned error can't find the container with id baf4b744b18206e7983cd0ae225d0d91682e49613ccc230cbd3417608a4ece23
Apr 20 15:02:22.996641 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:22.996620 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:02:22.996739 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:22.996725 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:22.996803 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:22.996770 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs podName:7215f3fe-093a-42b9-bea0-26a93cb4e1ff nodeName:}" failed. No retries permitted until 2026-04-20 15:02:23.996757825 +0000 UTC m=+4.127633107 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs") pod "network-metrics-daemon-sq52t" (UID: "7215f3fe-093a-42b9-bea0-26a93cb4e1ff") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:23.097811 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.097775 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wdxz\" (UniqueName: \"kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz\") pod \"network-check-target-5f8cl\" (UID: \"31c75c33-7396-4dd6-8404-bd0e038f65b3\") " pod="openshift-network-diagnostics/network-check-target-5f8cl"
Apr 20 15:02:23.097936 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:23.097913 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 15:02:23.097936 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:23.097932 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 15:02:23.098037 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:23.097944 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8wdxz for pod openshift-network-diagnostics/network-check-target-5f8cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 15:02:23.098037 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:23.098003 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz podName:31c75c33-7396-4dd6-8404-bd0e038f65b3 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:24.097984183 +0000 UTC m=+4.228859471 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8wdxz" (UniqueName: "kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz") pod "network-check-target-5f8cl" (UID: "31c75c33-7396-4dd6-8404-bd0e038f65b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 15:02:23.317670 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.317589 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:57:21 +0000 UTC" deadline="2027-12-25 19:29:25.143842793 +0000 UTC"
Apr 20 15:02:23.317670 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.317626 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14740h27m1.826220862s"
Apr 20 15:02:23.354259 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.354231 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wld75"]
Apr 20 15:02:23.355835 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.355816 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:23.355948 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:23.355890 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89"
Apr 20 15:02:23.400319 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.400291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1c29cec3-b554-4473-90f3-87629635db89-kubelet-config\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:23.400319 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.400324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:23.400546 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.400399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1c29cec3-b554-4473-90f3-87629635db89-dbus\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:23.412357 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.412325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" event={"ID":"772b645d-af27-49ac-9efa-a2cf5ea2725a","Type":"ContainerStarted","Data":"aeae1fb9ef7c391bb3105cc206c3732ee95fadc7a63b667fcd46ca2b3a24e08a"}
Apr 20 15:02:23.413400 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.413364 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qh2tf" event={"ID":"cb13e155-4ffa-49d2-8c49-c34c374a7d61","Type":"ContainerStarted","Data":"baf4b744b18206e7983cd0ae225d0d91682e49613ccc230cbd3417608a4ece23"}
Apr 20 15:02:23.414442 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.414414 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4jvg" event={"ID":"1f117f6f-4e31-4bc5-91d0-9a6176af628e","Type":"ContainerStarted","Data":"8e76982ecbbb9bb49e89eb46d94d8eea74921f989ad2772425f9c9954b6e3ea7"}
Apr 20 15:02:23.415494 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.415448 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x8tdg" event={"ID":"22757559-941f-4d9e-9128-3aeefc6665f3","Type":"ContainerStarted","Data":"6c1ce787d8facf2ce8b83f0c8ecf0a6bda96de36d46f861f5df835d284616a1c"}
Apr 20 15:02:23.416524 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.416478 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5nnpq" event={"ID":"c683848d-ce2a-4a44-8396-37b7a8863b07","Type":"ContainerStarted","Data":"3155d55ea309c4b4ee53cc3e3c0b03ec8ca279950bad7844df5dd10cc0ee7f1f"}
Apr 20 15:02:23.418093 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.418069 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-230.ec2.internal" event={"ID":"22ceeb306285ffc0c74b817e3784b6a3","Type":"ContainerStarted","Data":"0251c219cb601c74723edbcaf66bd690611614b050d646305ec14e1eee00bcb0"}
Apr 20 15:02:23.419427 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.419400 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b568r" event={"ID":"de3eeafd-2ce5-4f51-9232-ac55f91bb7af","Type":"ContainerStarted","Data":"4f38838b42d2b2d8d7717db1c95c30734a08c5d00e0b4c1bc0635b7989a70a01"}
Apr 20 15:02:23.420597 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.420570 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" event={"ID":"8345e9c1-deff-477f-ba7b-f5320279bda9","Type":"ContainerStarted","Data":"dc955dc049362a62163277fa1615ba30bd5cce6be6a550a995f26f76170632fb"}
Apr 20 15:02:23.425396 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.425341 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" event={"ID":"5907ebec-df4b-4653-b568-7e4913dcec73","Type":"ContainerStarted","Data":"3c950cd6e8de94d73a73bd22f23df09235aad79b3708106c1504c4373342690d"}
Apr 20 15:02:23.434457 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.434411 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-230.ec2.internal" podStartSLOduration=2.434398213 podStartE2EDuration="2.434398213s" podCreationTimestamp="2026-04-20 15:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:02:23.434283747 +0000 UTC m=+3.565159049" watchObservedRunningTime="2026-04-20 15:02:23.434398213 +0000 UTC m=+3.565273517"
Apr 20 15:02:23.501126 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.500969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1c29cec3-b554-4473-90f3-87629635db89-kubelet-config\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:23.501126 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.501015 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:23.501126 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.501048 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1c29cec3-b554-4473-90f3-87629635db89-kubelet-config\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:23.501126 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.501053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1c29cec3-b554-4473-90f3-87629635db89-dbus\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:23.501126 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.501098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1c29cec3-b554-4473-90f3-87629635db89-dbus\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:23.501354 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:23.501150 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:23.501354 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:23.501204 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret podName:1c29cec3-b554-4473-90f3-87629635db89 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:24.001186834 +0000 UTC m=+4.132062115 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret") pod "global-pull-secret-syncer-wld75" (UID: "1c29cec3-b554-4473-90f3-87629635db89") : object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:23.548058 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:23.548002 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 15:02:24.005505 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:24.005197 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:24.005505 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:24.005472 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:02:24.005719 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:24.005655 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:24.005719 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:24.005718 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs podName:7215f3fe-093a-42b9-bea0-26a93cb4e1ff nodeName:}" failed. No retries permitted until 2026-04-20 15:02:26.005699971 +0000 UTC m=+6.136575258 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs") pod "network-metrics-daemon-sq52t" (UID: "7215f3fe-093a-42b9-bea0-26a93cb4e1ff") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:24.005973 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:24.005947 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:24.006025 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:24.006017 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret podName:1c29cec3-b554-4473-90f3-87629635db89 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:25.00599985 +0000 UTC m=+5.136875146 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret") pod "global-pull-secret-syncer-wld75" (UID: "1c29cec3-b554-4473-90f3-87629635db89") : object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:24.106035 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:24.105997 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wdxz\" (UniqueName: \"kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz\") pod \"network-check-target-5f8cl\" (UID: \"31c75c33-7396-4dd6-8404-bd0e038f65b3\") " pod="openshift-network-diagnostics/network-check-target-5f8cl"
Apr 20 15:02:24.106229 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:24.106207 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 15:02:24.106229 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:24.106229 2572
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:24.106343 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:24.106242 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8wdxz for pod openshift-network-diagnostics/network-check-target-5f8cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:24.106343 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:24.106302 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz podName:31c75c33-7396-4dd6-8404-bd0e038f65b3 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:26.106283823 +0000 UTC m=+6.237159111 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8wdxz" (UniqueName: "kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz") pod "network-check-target-5f8cl" (UID: "31c75c33-7396-4dd6-8404-bd0e038f65b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:24.406776 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:24.406706 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:24.407187 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:24.406837 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:24.407524 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:24.407477 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:24.407654 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:24.407628 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:24.437862 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:24.436901 2572 generic.go:358] "Generic (PLEG): container finished" podID="00a4ac28b2d170e2b762126f5342671d" containerID="9ca84e9279e0071d529c054a5fbf137cf9993cf837f2bc4e81a865ccae80079f" exitCode=0 Apr 20 15:02:24.437862 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:24.437758 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal" event={"ID":"00a4ac28b2d170e2b762126f5342671d","Type":"ContainerDied","Data":"9ca84e9279e0071d529c054a5fbf137cf9993cf837f2bc4e81a865ccae80079f"} Apr 20 15:02:25.014844 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:25.014767 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:25.014994 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:25.014914 2572 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 15:02:25.014994 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:25.014973 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret podName:1c29cec3-b554-4473-90f3-87629635db89 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:27.014957124 +0000 UTC m=+7.145832420 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret") pod "global-pull-secret-syncer-wld75" (UID: "1c29cec3-b554-4473-90f3-87629635db89") : object "kube-system"/"original-pull-secret" not registered Apr 20 15:02:25.405464 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:25.405434 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:25.405654 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:25.405583 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:25.442431 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:25.442397 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal" event={"ID":"00a4ac28b2d170e2b762126f5342671d","Type":"ContainerStarted","Data":"67dfb84e543807de026c30e65aa1417533d3a30578ff509055eb996c3d048bdf"} Apr 20 15:02:26.024230 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:26.023713 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:26.024230 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:26.023857 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:26.024230 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:26.023913 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs podName:7215f3fe-093a-42b9-bea0-26a93cb4e1ff nodeName:}" failed. No retries permitted until 2026-04-20 15:02:30.023895334 +0000 UTC m=+10.154770623 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs") pod "network-metrics-daemon-sq52t" (UID: "7215f3fe-093a-42b9-bea0-26a93cb4e1ff") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:26.124946 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:26.124912 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wdxz\" (UniqueName: \"kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz\") pod \"network-check-target-5f8cl\" (UID: \"31c75c33-7396-4dd6-8404-bd0e038f65b3\") " pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:26.125123 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:26.125075 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:26.125123 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:26.125094 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:26.125123 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:26.125105 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8wdxz for pod openshift-network-diagnostics/network-check-target-5f8cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:26.125282 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:26.125160 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz podName:31c75c33-7396-4dd6-8404-bd0e038f65b3 nodeName:}" failed. 
No retries permitted until 2026-04-20 15:02:30.125141342 +0000 UTC m=+10.256016630 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8wdxz" (UniqueName: "kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz") pod "network-check-target-5f8cl" (UID: "31c75c33-7396-4dd6-8404-bd0e038f65b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:26.405216 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:26.405119 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:26.405386 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:26.405257 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:26.405683 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:26.405513 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:26.405683 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:26.405639 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:27.032441 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:27.032381 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:27.032892 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:27.032560 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 15:02:27.032892 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:27.032642 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret podName:1c29cec3-b554-4473-90f3-87629635db89 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:31.032618557 +0000 UTC m=+11.163493861 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret") pod "global-pull-secret-syncer-wld75" (UID: "1c29cec3-b554-4473-90f3-87629635db89") : object "kube-system"/"original-pull-secret" not registered Apr 20 15:02:27.405559 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:27.405472 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:27.405722 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:27.405619 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:28.405289 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:28.405253 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:28.405769 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:28.405381 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:28.405769 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:28.405446 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:28.405769 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:28.405582 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:29.405662 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:29.405611 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:29.406068 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:29.405743 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:30.058193 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:30.058157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:30.058371 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:30.058328 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:30.058441 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:30.058401 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs podName:7215f3fe-093a-42b9-bea0-26a93cb4e1ff nodeName:}" failed. No retries permitted until 2026-04-20 15:02:38.058378009 +0000 UTC m=+18.189253303 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs") pod "network-metrics-daemon-sq52t" (UID: "7215f3fe-093a-42b9-bea0-26a93cb4e1ff") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:30.158988 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:30.158947 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wdxz\" (UniqueName: \"kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz\") pod \"network-check-target-5f8cl\" (UID: \"31c75c33-7396-4dd6-8404-bd0e038f65b3\") " pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:30.159166 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:30.159144 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:30.159240 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:30.159171 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:30.159240 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:30.159186 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8wdxz for pod openshift-network-diagnostics/network-check-target-5f8cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:30.159339 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:30.159255 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz podName:31c75c33-7396-4dd6-8404-bd0e038f65b3 nodeName:}" failed. 
No retries permitted until 2026-04-20 15:02:38.159235345 +0000 UTC m=+18.290110628 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8wdxz" (UniqueName: "kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz") pod "network-check-target-5f8cl" (UID: "31c75c33-7396-4dd6-8404-bd0e038f65b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:30.406986 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:30.406642 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:30.406986 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:30.406740 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:30.406986 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:30.406799 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:30.406986 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:30.406880 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:31.064437 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:31.064394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:31.064644 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:31.064593 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 15:02:31.064715 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:31.064674 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret podName:1c29cec3-b554-4473-90f3-87629635db89 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:39.064653796 +0000 UTC m=+19.195529079 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret") pod "global-pull-secret-syncer-wld75" (UID: "1c29cec3-b554-4473-90f3-87629635db89") : object "kube-system"/"original-pull-secret" not registered Apr 20 15:02:31.405957 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:31.405883 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:31.406121 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:31.406026 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:32.405397 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:32.405352 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:32.405993 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:32.405497 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:32.405993 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:32.405540 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:32.405993 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:32.405662 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:33.404975 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:33.404942 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:33.405136 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:33.405055 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:34.405236 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:34.405172 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:34.405691 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:34.405302 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:34.405691 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:34.405363 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:34.405691 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:34.405501 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:35.405377 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:35.405347 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:35.405861 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:35.405464 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:36.405995 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:36.405961 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:36.406372 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:36.405964 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:36.406372 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:36.406071 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:36.406372 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:36.406160 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:37.405101 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:37.405073 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:37.405294 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:37.405169 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:38.120892 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:38.120821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:38.121360 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:38.120958 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:38.121360 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:38.121032 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs podName:7215f3fe-093a-42b9-bea0-26a93cb4e1ff nodeName:}" failed. No retries permitted until 2026-04-20 15:02:54.121010964 +0000 UTC m=+34.251886248 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs") pod "network-metrics-daemon-sq52t" (UID: "7215f3fe-093a-42b9-bea0-26a93cb4e1ff") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:38.221861 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:38.221832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wdxz\" (UniqueName: \"kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz\") pod \"network-check-target-5f8cl\" (UID: \"31c75c33-7396-4dd6-8404-bd0e038f65b3\") " pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:38.222033 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:38.221984 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:38.222033 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:38.222005 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:38.222033 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:38.222016 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8wdxz for pod openshift-network-diagnostics/network-check-target-5f8cl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:38.222170 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:38.222063 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz podName:31c75c33-7396-4dd6-8404-bd0e038f65b3 nodeName:}" failed. 
No retries permitted until 2026-04-20 15:02:54.222051204 +0000 UTC m=+34.352926485 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8wdxz" (UniqueName: "kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz") pod "network-check-target-5f8cl" (UID: "31c75c33-7396-4dd6-8404-bd0e038f65b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:38.405265 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:38.405187 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:38.405421 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:38.405187 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:38.405421 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:38.405333 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:38.405421 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:38.405395 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:39.126890 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:39.126854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:39.127333 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:39.126982 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 15:02:39.127333 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:39.127047 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret podName:1c29cec3-b554-4473-90f3-87629635db89 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:55.127031503 +0000 UTC m=+35.257906789 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret") pod "global-pull-secret-syncer-wld75" (UID: "1c29cec3-b554-4473-90f3-87629635db89") : object "kube-system"/"original-pull-secret" not registered Apr 20 15:02:39.405623 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:39.405540 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:39.405755 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:39.405705 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:40.406646 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.406430 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:40.407306 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.406563 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:40.407306 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:40.406715 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:40.407306 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:40.406821 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:40.466662 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.466581 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b568r" event={"ID":"de3eeafd-2ce5-4f51-9232-ac55f91bb7af","Type":"ContainerStarted","Data":"586bacf5fcc5b7c4b485a53b52e2bd24b697ff457d41e1d0075dafe2ab3fc152"} Apr 20 15:02:40.468051 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.468024 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" event={"ID":"8345e9c1-deff-477f-ba7b-f5320279bda9","Type":"ContainerStarted","Data":"ff9e7a46983f82683190a67004fe22acc929fde9de655b70bede7f5bb6565982"} Apr 20 15:02:40.469359 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.469334 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" event={"ID":"5907ebec-df4b-4653-b568-7e4913dcec73","Type":"ContainerStarted","Data":"db2f4c3323767ed3af1c80be7c6bb8cd19bd5a94578e7e0659ebf6717d693010"} Apr 20 15:02:40.472006 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.471983 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" event={"ID":"772b645d-af27-49ac-9efa-a2cf5ea2725a","Type":"ContainerStarted","Data":"f6fd5b702fcfd86f5183d8cc91dc34b17526a908815302af3c281d6f23a71278"} Apr 20 15:02:40.472100 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.472009 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" event={"ID":"772b645d-af27-49ac-9efa-a2cf5ea2725a","Type":"ContainerStarted","Data":"826072524f44b9b9b6af2a068773e6cca232a04f81a2fc4fdf38b7e8b9f1c3c1"} Apr 20 15:02:40.472100 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.472023 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" 
event={"ID":"772b645d-af27-49ac-9efa-a2cf5ea2725a","Type":"ContainerStarted","Data":"d64c3f1bfd91b895b97764394be0152e5305dbf40d82e2a8daca02a62e0acf61"} Apr 20 15:02:40.472100 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.472035 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" event={"ID":"772b645d-af27-49ac-9efa-a2cf5ea2725a","Type":"ContainerStarted","Data":"270262c06335222633c1baaa8eae2ce00f915ecf809ed09468785e269dcb6ee2"} Apr 20 15:02:40.472100 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.472048 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" event={"ID":"772b645d-af27-49ac-9efa-a2cf5ea2725a","Type":"ContainerStarted","Data":"b88b511f8f8dbcc370abdfef46c45ab204a92499d4ba447053cb7edfd03ab040"} Apr 20 15:02:40.472100 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.472057 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" event={"ID":"772b645d-af27-49ac-9efa-a2cf5ea2725a","Type":"ContainerStarted","Data":"e83f37c9cb2a63a863f8b02d043af8a9848c6beb1d9588d60a7591bd441195a2"} Apr 20 15:02:40.473269 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.473249 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qh2tf" event={"ID":"cb13e155-4ffa-49d2-8c49-c34c374a7d61","Type":"ContainerStarted","Data":"5c10de851c40f7b5d3cde7e679e4d63a427b80ca259c8e0e4adb4d1fab39be5f"} Apr 20 15:02:40.474657 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.474622 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f117f6f-4e31-4bc5-91d0-9a6176af628e" containerID="95859942a243f0e4dc6b292a430a3a1232ae31799c58b7efb2a19448858f9fb1" exitCode=0 Apr 20 15:02:40.474753 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.474668 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4jvg" 
event={"ID":"1f117f6f-4e31-4bc5-91d0-9a6176af628e","Type":"ContainerDied","Data":"95859942a243f0e4dc6b292a430a3a1232ae31799c58b7efb2a19448858f9fb1"} Apr 20 15:02:40.475965 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.475945 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x8tdg" event={"ID":"22757559-941f-4d9e-9128-3aeefc6665f3","Type":"ContainerStarted","Data":"a1de8a6865e3b98c95609992b65802669550288cca5912127cf0746ca567cf79"} Apr 20 15:02:40.482975 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.482927 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-230.ec2.internal" podStartSLOduration=19.482912678 podStartE2EDuration="19.482912678s" podCreationTimestamp="2026-04-20 15:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:02:25.455860636 +0000 UTC m=+5.586735940" watchObservedRunningTime="2026-04-20 15:02:40.482912678 +0000 UTC m=+20.613788050" Apr 20 15:02:40.483144 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.483114 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-b568r" podStartSLOduration=3.8154883059999998 podStartE2EDuration="20.483107561s" podCreationTimestamp="2026-04-20 15:02:20 +0000 UTC" firstStartedPulling="2026-04-20 15:02:22.972329669 +0000 UTC m=+3.103204955" lastFinishedPulling="2026-04-20 15:02:39.639948927 +0000 UTC m=+19.770824210" observedRunningTime="2026-04-20 15:02:40.48215725 +0000 UTC m=+20.613032554" watchObservedRunningTime="2026-04-20 15:02:40.483107561 +0000 UTC m=+20.613982865" Apr 20 15:02:40.496331 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.496299 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-qh2tf" podStartSLOduration=3.903097061 
podStartE2EDuration="20.49628948s" podCreationTimestamp="2026-04-20 15:02:20 +0000 UTC" firstStartedPulling="2026-04-20 15:02:22.97414264 +0000 UTC m=+3.105017938" lastFinishedPulling="2026-04-20 15:02:39.56733507 +0000 UTC m=+19.698210357" observedRunningTime="2026-04-20 15:02:40.495832268 +0000 UTC m=+20.626707611" watchObservedRunningTime="2026-04-20 15:02:40.49628948 +0000 UTC m=+20.627164783" Apr 20 15:02:40.507945 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.507913 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x8tdg" podStartSLOduration=3.908816543 podStartE2EDuration="20.507905209s" podCreationTimestamp="2026-04-20 15:02:20 +0000 UTC" firstStartedPulling="2026-04-20 15:02:22.968236601 +0000 UTC m=+3.099111885" lastFinishedPulling="2026-04-20 15:02:39.56732527 +0000 UTC m=+19.698200551" observedRunningTime="2026-04-20 15:02:40.507757731 +0000 UTC m=+20.638633036" watchObservedRunningTime="2026-04-20 15:02:40.507905209 +0000 UTC m=+20.638780511" Apr 20 15:02:40.779847 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.779806 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-m6lqv" podStartSLOduration=4.109515557 podStartE2EDuration="20.779790959s" podCreationTimestamp="2026-04-20 15:02:20 +0000 UTC" firstStartedPulling="2026-04-20 15:02:22.969618294 +0000 UTC m=+3.100493580" lastFinishedPulling="2026-04-20 15:02:39.639893701 +0000 UTC m=+19.770768982" observedRunningTime="2026-04-20 15:02:40.545595711 +0000 UTC m=+20.676471013" watchObservedRunningTime="2026-04-20 15:02:40.779790959 +0000 UTC m=+20.910666262" Apr 20 15:02:40.780310 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.780292 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-p5xng"] Apr 20 15:02:40.785351 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.785331 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-p5xng" Apr 20 15:02:40.787644 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.787625 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 15:02:40.787867 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.787851 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 15:02:40.787977 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.787962 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wt7fq\"" Apr 20 15:02:40.817324 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.817306 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 15:02:40.942971 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.942935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/034b6f21-f85a-4440-a061-d39a6df5dde4-hosts-file\") pod \"node-resolver-p5xng\" (UID: \"034b6f21-f85a-4440-a061-d39a6df5dde4\") " pod="openshift-dns/node-resolver-p5xng" Apr 20 15:02:40.943130 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.942995 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/034b6f21-f85a-4440-a061-d39a6df5dde4-tmp-dir\") pod \"node-resolver-p5xng\" (UID: \"034b6f21-f85a-4440-a061-d39a6df5dde4\") " pod="openshift-dns/node-resolver-p5xng" Apr 20 15:02:40.943130 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:40.943032 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlh82\" (UniqueName: 
\"kubernetes.io/projected/034b6f21-f85a-4440-a061-d39a6df5dde4-kube-api-access-nlh82\") pod \"node-resolver-p5xng\" (UID: \"034b6f21-f85a-4440-a061-d39a6df5dde4\") " pod="openshift-dns/node-resolver-p5xng" Apr 20 15:02:41.043627 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.043594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/034b6f21-f85a-4440-a061-d39a6df5dde4-hosts-file\") pod \"node-resolver-p5xng\" (UID: \"034b6f21-f85a-4440-a061-d39a6df5dde4\") " pod="openshift-dns/node-resolver-p5xng" Apr 20 15:02:41.043627 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.043635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/034b6f21-f85a-4440-a061-d39a6df5dde4-tmp-dir\") pod \"node-resolver-p5xng\" (UID: \"034b6f21-f85a-4440-a061-d39a6df5dde4\") " pod="openshift-dns/node-resolver-p5xng" Apr 20 15:02:41.043865 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.043654 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlh82\" (UniqueName: \"kubernetes.io/projected/034b6f21-f85a-4440-a061-d39a6df5dde4-kube-api-access-nlh82\") pod \"node-resolver-p5xng\" (UID: \"034b6f21-f85a-4440-a061-d39a6df5dde4\") " pod="openshift-dns/node-resolver-p5xng" Apr 20 15:02:41.043865 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.043712 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/034b6f21-f85a-4440-a061-d39a6df5dde4-hosts-file\") pod \"node-resolver-p5xng\" (UID: \"034b6f21-f85a-4440-a061-d39a6df5dde4\") " pod="openshift-dns/node-resolver-p5xng" Apr 20 15:02:41.043938 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.043907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/034b6f21-f85a-4440-a061-d39a6df5dde4-tmp-dir\") pod \"node-resolver-p5xng\" (UID: \"034b6f21-f85a-4440-a061-d39a6df5dde4\") " pod="openshift-dns/node-resolver-p5xng" Apr 20 15:02:41.053927 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.053903 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlh82\" (UniqueName: \"kubernetes.io/projected/034b6f21-f85a-4440-a061-d39a6df5dde4-kube-api-access-nlh82\") pod \"node-resolver-p5xng\" (UID: \"034b6f21-f85a-4440-a061-d39a6df5dde4\") " pod="openshift-dns/node-resolver-p5xng" Apr 20 15:02:41.097354 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.097330 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p5xng" Apr 20 15:02:41.106135 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:41.106108 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod034b6f21_f85a_4440_a061_d39a6df5dde4.slice/crio-24b52fddaf535c4236682e0c931f83588eb05213f28348eda389af58c187dcf7 WatchSource:0}: Error finding container 24b52fddaf535c4236682e0c931f83588eb05213f28348eda389af58c187dcf7: Status 404 returned error can't find the container with id 24b52fddaf535c4236682e0c931f83588eb05213f28348eda389af58c187dcf7 Apr 20 15:02:41.357153 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.356986 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T15:02:40.817320635Z","UUID":"d2e168a7-f5c2-4754-ba1f-cf794e2da5e5","Handler":null,"Name":"","Endpoint":""} Apr 20 15:02:41.358921 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.358892 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 15:02:41.359054 
ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.358929 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 15:02:41.405906 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.405873 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:41.406074 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:41.406000 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:41.419620 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.419592 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-qh2tf" Apr 20 15:02:41.420437 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.420185 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-qh2tf" Apr 20 15:02:41.479321 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.479283 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5nnpq" event={"ID":"c683848d-ce2a-4a44-8396-37b7a8863b07","Type":"ContainerStarted","Data":"dd03d1edf5caf87a8ecf153cc2691d8a3d1b117b8ca99f8c51b9697cc8bdfafd"} Apr 20 15:02:41.480832 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.480800 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p5xng" event={"ID":"034b6f21-f85a-4440-a061-d39a6df5dde4","Type":"ContainerStarted","Data":"099a18f711866e1cc9756bccd0b387069ff003f9cc49fe0bb23eb399959aa58e"} Apr 20 15:02:41.480832 
ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.480832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p5xng" event={"ID":"034b6f21-f85a-4440-a061-d39a6df5dde4","Type":"ContainerStarted","Data":"24b52fddaf535c4236682e0c931f83588eb05213f28348eda389af58c187dcf7"} Apr 20 15:02:41.483442 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.482770 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" event={"ID":"5907ebec-df4b-4653-b568-7e4913dcec73","Type":"ContainerStarted","Data":"868d6cb583fb808c063c95b78aeb210f7bc3be8a0a65f06ff9c7ef082b6f836e"} Apr 20 15:02:41.506649 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.506600 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5nnpq" podStartSLOduration=4.903922104 podStartE2EDuration="21.506583504s" podCreationTimestamp="2026-04-20 15:02:20 +0000 UTC" firstStartedPulling="2026-04-20 15:02:22.964679012 +0000 UTC m=+3.095554295" lastFinishedPulling="2026-04-20 15:02:39.567340408 +0000 UTC m=+19.698215695" observedRunningTime="2026-04-20 15:02:41.491772368 +0000 UTC m=+21.622647670" watchObservedRunningTime="2026-04-20 15:02:41.506583504 +0000 UTC m=+21.637458808" Apr 20 15:02:41.506954 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:41.506916 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p5xng" podStartSLOduration=1.506904247 podStartE2EDuration="1.506904247s" podCreationTimestamp="2026-04-20 15:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:02:41.506207639 +0000 UTC m=+21.637082966" watchObservedRunningTime="2026-04-20 15:02:41.506904247 +0000 UTC m=+21.637779551" Apr 20 15:02:42.405279 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:42.405240 2572 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:42.405480 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:42.405253 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:42.405480 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:42.405351 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:42.405480 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:42.405413 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:42.489022 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:42.488983 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" event={"ID":"5907ebec-df4b-4653-b568-7e4913dcec73","Type":"ContainerStarted","Data":"9b342cd2c3398ce8170b45a42098f3173f719706fa04ee14a231cd2e58e08d67"} Apr 20 15:02:42.491560 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:42.491531 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" event={"ID":"772b645d-af27-49ac-9efa-a2cf5ea2725a","Type":"ContainerStarted","Data":"693ae360598493d301201bd855a8058bb03b105a7edd5bdd98e608cf9824368a"} Apr 20 15:02:42.491682 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:42.491583 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 15:02:43.405697 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:43.405520 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:43.405878 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:43.405780 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:43.784993 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:43.784958 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-qh2tf" Apr 20 15:02:43.785441 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:43.785090 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 15:02:43.785597 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:43.785570 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-qh2tf" Apr 20 15:02:43.799163 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:43.799124 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xt47l" podStartSLOduration=5.084204039 podStartE2EDuration="23.799111292s" podCreationTimestamp="2026-04-20 15:02:20 +0000 UTC" firstStartedPulling="2026-04-20 15:02:22.96611408 +0000 UTC m=+3.096989361" lastFinishedPulling="2026-04-20 15:02:41.681021318 +0000 UTC m=+21.811896614" observedRunningTime="2026-04-20 15:02:42.519747623 +0000 UTC m=+22.650622926" watchObservedRunningTime="2026-04-20 15:02:43.799111292 +0000 UTC m=+23.929986595" Apr 20 15:02:44.404984 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:44.404954 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:44.405144 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:44.405085 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:44.405144 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:44.405108 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:44.405263 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:44.405199 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:45.405120 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:45.405075 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:45.405508 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:45.405229 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:46.405070 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:46.405039 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:46.405070 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:46.405039 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:46.405573 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:46.405204 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:46.405573 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:46.405242 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:46.499962 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:46.499929 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f117f6f-4e31-4bc5-91d0-9a6176af628e" containerID="86535d7e620c0dbd9f07da500f3a6d86ba91eabf56947d2fe2eafbfcd909034a" exitCode=0 Apr 20 15:02:46.500082 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:46.500015 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4jvg" event={"ID":"1f117f6f-4e31-4bc5-91d0-9a6176af628e","Type":"ContainerDied","Data":"86535d7e620c0dbd9f07da500f3a6d86ba91eabf56947d2fe2eafbfcd909034a"} Apr 20 15:02:46.503448 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:46.503428 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" 
event={"ID":"772b645d-af27-49ac-9efa-a2cf5ea2725a","Type":"ContainerStarted","Data":"e72ab6db14ff430479b2a64daa52b37a7b82bd49df70e6fe0c79b86efe17a3bc"} Apr 20 15:02:46.503737 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:46.503721 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:46.503793 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:46.503748 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:46.517085 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:46.517068 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:46.549423 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:46.549389 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" podStartSLOduration=9.732965485 podStartE2EDuration="26.549379388s" podCreationTimestamp="2026-04-20 15:02:20 +0000 UTC" firstStartedPulling="2026-04-20 15:02:22.963215945 +0000 UTC m=+3.094091228" lastFinishedPulling="2026-04-20 15:02:39.779629849 +0000 UTC m=+19.910505131" observedRunningTime="2026-04-20 15:02:46.545917646 +0000 UTC m=+26.676792948" watchObservedRunningTime="2026-04-20 15:02:46.549379388 +0000 UTC m=+26.680254690" Apr 20 15:02:47.405527 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:47.405478 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:47.406238 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:47.405632 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:47.507014 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:47.506981 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f117f6f-4e31-4bc5-91d0-9a6176af628e" containerID="7ea7aa9414c3168132e0f8f198c61ec49e95fe3b759deceb9f924c001456ca8b" exitCode=0 Apr 20 15:02:47.507162 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:47.507073 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4jvg" event={"ID":"1f117f6f-4e31-4bc5-91d0-9a6176af628e","Type":"ContainerDied","Data":"7ea7aa9414c3168132e0f8f198c61ec49e95fe3b759deceb9f924c001456ca8b"} Apr 20 15:02:47.507555 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:47.507532 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:47.520888 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:47.520867 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:02:47.655661 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:47.655601 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5f8cl"] Apr 20 15:02:47.655769 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:47.655701 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:47.655803 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:47.655776 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:47.658962 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:47.658938 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wld75"] Apr 20 15:02:47.659079 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:47.659037 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:47.659152 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:47.659135 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:47.659763 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:47.659740 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sq52t"] Apr 20 15:02:47.659861 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:47.659848 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:47.659988 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:47.659962 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:49.404962 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:49.404932 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:49.405406 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:49.404936 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:49.405406 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:49.405028 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:49.405406 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:49.405111 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:49.405406 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:49.404932 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:49.405406 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:49.405184 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:49.512746 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:49.512712 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f117f6f-4e31-4bc5-91d0-9a6176af628e" containerID="e61b2b0e3ce43278c7c44bd75eed8ab89580dc73011f267acd2e12d5dcb5bd94" exitCode=0 Apr 20 15:02:49.512898 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:49.512785 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4jvg" event={"ID":"1f117f6f-4e31-4bc5-91d0-9a6176af628e","Type":"ContainerDied","Data":"e61b2b0e3ce43278c7c44bd75eed8ab89580dc73011f267acd2e12d5dcb5bd94"} Apr 20 15:02:51.405722 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.405687 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75" Apr 20 15:02:51.406150 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.405687 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:02:51.406150 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:51.405810 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wld75" podUID="1c29cec3-b554-4473-90f3-87629635db89" Apr 20 15:02:51.406150 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.405690 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:02:51.406150 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:51.405906 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5f8cl" podUID="31c75c33-7396-4dd6-8404-bd0e038f65b3" Apr 20 15:02:51.406150 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:51.406018 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:02:51.711258 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.711180 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-230.ec2.internal" event="NodeReady" Apr 20 15:02:51.711401 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.711315 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 15:02:51.756328 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.756271 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk"] Apr 20 15:02:51.787711 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.787682 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5f46db577b-xgcwj"] Apr 20 15:02:51.788008 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.787821 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:02:51.791343 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.791317 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 15:02:51.791453 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.791321 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-dtcpk\"" Apr 20 15:02:51.791453 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.791355 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 15:02:51.805955 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.805888 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk"] Apr 20 15:02:51.806051 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.805968 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p98mt"] Apr 20 15:02:51.806051 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.806003 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:51.808952 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.808933 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 15:02:51.809552 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.809536 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 15:02:51.811216 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.811193 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nfkdq\"" Apr 20 15:02:51.823996 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.823969 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f46db577b-xgcwj"] Apr 20 15:02:51.824087 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.824005 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fdmd9"] Apr 20 15:02:51.824087 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.824028 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-p98mt" Apr 20 15:02:51.827694 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.827674 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 15:02:51.827694 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.827689 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 15:02:51.827817 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.827679 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vdrzr\"" Apr 20 15:02:51.832752 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.832733 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 15:02:51.834878 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.834857 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 15:02:51.842163 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.842146 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p98mt"] Apr 20 15:02:51.842251 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.842167 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fdmd9"] Apr 20 15:02:51.842251 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.842249 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:02:51.845855 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.845837 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 15:02:51.845943 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.845885 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 15:02:51.846139 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.846125 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 15:02:51.846222 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.846129 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4l2x2\"" Apr 20 15:02:51.927615 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.927579 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f915f3e5-4964-46bc-9adb-e34434ecea10-ca-trust-extracted\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:51.927760 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.927622 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-bound-sa-token\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:51.927760 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.927688 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-image-registry-private-configuration\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:51.927760 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.927726 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-installation-pull-secrets\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:51.927925 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.927776 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:51.927925 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.927811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plhrx\" (UniqueName: \"kubernetes.io/projected/20db58c7-db2e-4b2b-be5a-cf2278346010-kube-api-access-plhrx\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:02:51.927925 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.927893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20db58c7-db2e-4b2b-be5a-cf2278346010-tmp-dir\") pod 
\"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:02:51.928060 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.927950 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-trusted-ca\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:51.928060 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.927979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkn57\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-kube-api-access-zkn57\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:51.928060 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.928000 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:02:51.928060 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.928030 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eb88787e-8848-4d3f-bcdd-871260569c2c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:02:51.928060 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.928050 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:02:51.928257 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.928086 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqxgx\" (UniqueName: \"kubernetes.io/projected/8337a855-24f1-476f-b9e0-49701fd9bda2-kube-api-access-mqxgx\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:02:51.928257 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.928120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-certificates\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:51.928257 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.928145 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20db58c7-db2e-4b2b-be5a-cf2278346010-config-volume\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:02:51.928257 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:51.928170 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" 
(UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:02:52.028942 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.028910 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20db58c7-db2e-4b2b-be5a-cf2278346010-tmp-dir\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:02:52.029095 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.028983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-trusted-ca\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:52.029095 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkn57\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-kube-api-access-zkn57\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:52.029095 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:02:52.029095 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029072 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eb88787e-8848-4d3f-bcdd-871260569c2c-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:02:52.029296 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:02:52.029296 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029124 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqxgx\" (UniqueName: \"kubernetes.io/projected/8337a855-24f1-476f-b9e0-49701fd9bda2-kube-api-access-mqxgx\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:02:52.029296 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-certificates\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:52.029296 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029176 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20db58c7-db2e-4b2b-be5a-cf2278346010-config-volume\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:02:52.029296 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029203 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:02:52.029296 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f915f3e5-4964-46bc-9adb-e34434ecea10-ca-trust-extracted\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:52.029296 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029259 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-bound-sa-token\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:52.029685 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029312 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20db58c7-db2e-4b2b-be5a-cf2278346010-tmp-dir\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:02:52.029685 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.029327 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:02:52.029685 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.029406 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls podName:20db58c7-db2e-4b2b-be5a-cf2278346010 
nodeName:}" failed. No retries permitted until 2026-04-20 15:02:52.529383982 +0000 UTC m=+32.660259277 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls") pod "dns-default-p98mt" (UID: "20db58c7-db2e-4b2b-be5a-cf2278346010") : secret "dns-default-metrics-tls" not found
Apr 20 15:02:52.029685 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.029433 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 15:02:52.029685 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.029523 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert podName:8337a855-24f1-476f-b9e0-49701fd9bda2 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:52.529504125 +0000 UTC m=+32.660379419 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert") pod "ingress-canary-fdmd9" (UID: "8337a855-24f1-476f-b9e0-49701fd9bda2") : secret "canary-serving-cert" not found
Apr 20 15:02:52.029685 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-image-registry-private-configuration\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:52.029685 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029675 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-installation-pull-secrets\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:52.030013 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029704 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:52.030013 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plhrx\" (UniqueName: \"kubernetes.io/projected/20db58c7-db2e-4b2b-be5a-cf2278346010-kube-api-access-plhrx\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt"
Apr 20 15:02:52.030013 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f915f3e5-4964-46bc-9adb-e34434ecea10-ca-trust-extracted\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:52.030013 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eb88787e-8848-4d3f-bcdd-871260569c2c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk"
Apr 20 15:02:52.030013 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.029992 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-trusted-ca\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:52.030233 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.030048 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 15:02:52.030233 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.030076 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 15:02:52.030233 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.030091 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f46db577b-xgcwj: secret "image-registry-tls" not found
Apr 20 15:02:52.030233 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.030098 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert podName:eb88787e-8848-4d3f-bcdd-871260569c2c nodeName:}" failed. No retries permitted until 2026-04-20 15:02:52.530086288 +0000 UTC m=+32.660961574 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8p7wk" (UID: "eb88787e-8848-4d3f-bcdd-871260569c2c") : secret "networking-console-plugin-cert" not found
Apr 20 15:02:52.030233 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.030133 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls podName:f915f3e5-4964-46bc-9adb-e34434ecea10 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:52.530119262 +0000 UTC m=+32.660994557 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls") pod "image-registry-5f46db577b-xgcwj" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10") : secret "image-registry-tls" not found
Apr 20 15:02:52.033990 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.033966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-image-registry-private-configuration\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:52.033990 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.033984 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-installation-pull-secrets\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:52.037811 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.037782 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqxgx\" (UniqueName: \"kubernetes.io/projected/8337a855-24f1-476f-b9e0-49701fd9bda2-kube-api-access-mqxgx\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9"
Apr 20 15:02:52.037905 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.037877 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkn57\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-kube-api-access-zkn57\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:52.038151 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.038129 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plhrx\" (UniqueName: \"kubernetes.io/projected/20db58c7-db2e-4b2b-be5a-cf2278346010-kube-api-access-plhrx\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt"
Apr 20 15:02:52.038255 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.038236 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-bound-sa-token\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:52.039724 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.039705 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-certificates\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:52.042929 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.042912 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20db58c7-db2e-4b2b-be5a-cf2278346010-config-volume\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt"
Apr 20 15:02:52.533625 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.533589 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt"
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.533642 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk"
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.533676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9"
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:52.533709 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.533737 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.533813 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.533825 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.533817 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls podName:20db58c7-db2e-4b2b-be5a-cf2278346010 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:53.533798094 +0000 UTC m=+33.664673396 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls") pod "dns-default-p98mt" (UID: "20db58c7-db2e-4b2b-be5a-cf2278346010") : secret "dns-default-metrics-tls" not found
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.533855 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert podName:8337a855-24f1-476f-b9e0-49701fd9bda2 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:53.533845556 +0000 UTC m=+33.664720840 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert") pod "ingress-canary-fdmd9" (UID: "8337a855-24f1-476f-b9e0-49701fd9bda2") : secret "canary-serving-cert" not found
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.533871 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.533885 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f46db577b-xgcwj: secret "image-registry-tls" not found
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.533901 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert podName:eb88787e-8848-4d3f-bcdd-871260569c2c nodeName:}" failed. No retries permitted until 2026-04-20 15:02:53.533882558 +0000 UTC m=+33.664757844 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8p7wk" (UID: "eb88787e-8848-4d3f-bcdd-871260569c2c") : secret "networking-console-plugin-cert" not found
Apr 20 15:02:52.534239 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:52.533925 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls podName:f915f3e5-4964-46bc-9adb-e34434ecea10 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:53.533911519 +0000 UTC m=+33.664786813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls") pod "image-registry-5f46db577b-xgcwj" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10") : secret "image-registry-tls" not found
Apr 20 15:02:53.405769 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.405735 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl"
Apr 20 15:02:53.406005 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.405735 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:02:53.406005 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.405735 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:53.408345 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.408323 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 15:02:53.408473 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.408398 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 15:02:53.409233 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.409215 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 15:02:53.409306 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.409296 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dhzk6\""
Apr 20 15:02:53.409361 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.409301 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 15:02:53.409419 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.409382 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-64ff7\""
Apr 20 15:02:53.542177 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.542144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt"
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.542195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk"
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.542242 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9"
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:53.542283 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj"
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:53.542310 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:53.542372 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:53.542405 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:53.542373 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls podName:20db58c7-db2e-4b2b-be5a-cf2278346010 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:55.542358026 +0000 UTC m=+35.673233314 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls") pod "dns-default-p98mt" (UID: "20db58c7-db2e-4b2b-be5a-cf2278346010") : secret "dns-default-metrics-tls" not found
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:53.542417 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f46db577b-xgcwj: secret "image-registry-tls" not found
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:53.542437 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert podName:eb88787e-8848-4d3f-bcdd-871260569c2c nodeName:}" failed. No retries permitted until 2026-04-20 15:02:55.542420273 +0000 UTC m=+35.673295570 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8p7wk" (UID: "eb88787e-8848-4d3f-bcdd-871260569c2c") : secret "networking-console-plugin-cert" not found
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:53.542469 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls podName:f915f3e5-4964-46bc-9adb-e34434ecea10 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:55.542455794 +0000 UTC m=+35.673331076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls") pod "image-registry-5f46db577b-xgcwj" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10") : secret "image-registry-tls" not found
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:53.542523 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 15:02:53.542634 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:53.542568 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert podName:8337a855-24f1-476f-b9e0-49701fd9bda2 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:55.542557086 +0000 UTC m=+35.673432370 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert") pod "ingress-canary-fdmd9" (UID: "8337a855-24f1-476f-b9e0-49701fd9bda2") : secret "canary-serving-cert" not found
Apr 20 15:02:54.147310 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:54.147270 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:02:54.147580 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:54.147398 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 15:02:54.147580 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:54.147474 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs podName:7215f3fe-093a-42b9-bea0-26a93cb4e1ff nodeName:}" failed. No retries permitted until 2026-04-20 15:03:26.147453093 +0000 UTC m=+66.278328375 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs") pod "network-metrics-daemon-sq52t" (UID: "7215f3fe-093a-42b9-bea0-26a93cb4e1ff") : secret "metrics-daemon-secret" not found
Apr 20 15:02:54.248120 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:54.248075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wdxz\" (UniqueName: \"kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz\") pod \"network-check-target-5f8cl\" (UID: \"31c75c33-7396-4dd6-8404-bd0e038f65b3\") " pod="openshift-network-diagnostics/network-check-target-5f8cl"
Apr 20 15:02:54.251557 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:54.251530 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wdxz\" (UniqueName: \"kubernetes.io/projected/31c75c33-7396-4dd6-8404-bd0e038f65b3-kube-api-access-8wdxz\") pod \"network-check-target-5f8cl\" (UID: \"31c75c33-7396-4dd6-8404-bd0e038f65b3\") " pod="openshift-network-diagnostics/network-check-target-5f8cl"
Apr 20 15:02:54.317420 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:54.317386 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5f8cl"
Apr 20 15:02:55.154960 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.154921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:55.157377 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.157354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c29cec3-b554-4473-90f3-87629635db89-original-pull-secret\") pod \"global-pull-secret-syncer-wld75\" (UID: \"1c29cec3-b554-4473-90f3-87629635db89\") " pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:55.177359 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.177331 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d"]
Apr 20 15:02:55.208087 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.208063 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"]
Apr 20 15:02:55.208252 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.208229 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d"
Apr 20 15:02:55.211512 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.211409 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 20 15:02:55.211634 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.211563 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 20 15:02:55.212138 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.211935 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 20 15:02:55.212805 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.212300 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 20 15:02:55.212895 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.212832 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-6lt47\""
Apr 20 15:02:55.227318 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.227301 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"]
Apr 20 15:02:55.227419 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.227399 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"
Apr 20 15:02:55.229650 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.229630 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wld75"
Apr 20 15:02:55.229748 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.229723 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 20 15:02:55.245227 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.245207 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d"]
Apr 20 15:02:55.245308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.245232 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"]
Apr 20 15:02:55.245308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.245241 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"]
Apr 20 15:02:55.245427 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.245346 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.247986 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.247967 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 20 15:02:55.248080 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.248020 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 20 15:02:55.248080 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.248048 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 20 15:02:55.248080 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.248061 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 20 15:02:55.357133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.357103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0fe4c9de-e474-4b63-bc5b-da2d6d40ba00-klusterlet-config\") pod \"klusterlet-addon-workmgr-568757458-778zv\" (UID: \"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"
Apr 20 15:02:55.357133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.357134 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4864c\" (UniqueName: \"kubernetes.io/projected/0fe4c9de-e474-4b63-bc5b-da2d6d40ba00-kube-api-access-4864c\") pod \"klusterlet-addon-workmgr-568757458-778zv\" (UID: \"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"
Apr 20 15:02:55.357304 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.357153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw8r8\" (UniqueName: \"kubernetes.io/projected/eba5134a-506e-447a-8b4e-946fb42feaf3-kube-api-access-hw8r8\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.357304 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.357237 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fe4c9de-e474-4b63-bc5b-da2d6d40ba00-tmp\") pod \"klusterlet-addon-workmgr-568757458-778zv\" (UID: \"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"
Apr 20 15:02:55.357304 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.357270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/eba5134a-506e-447a-8b4e-946fb42feaf3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.357304 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.357291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.357435 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.357309 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-hub\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.357435 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.357334 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2545e8e2-aae3-49b8-80c8-71db98b8f417-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-67f887d797-rmz8d\" (UID: \"2545e8e2-aae3-49b8-80c8-71db98b8f417\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d"
Apr 20 15:02:55.357435 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.357414 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvscn\" (UniqueName: \"kubernetes.io/projected/2545e8e2-aae3-49b8-80c8-71db98b8f417-kube-api-access-rvscn\") pod \"managed-serviceaccount-addon-agent-67f887d797-rmz8d\" (UID: \"2545e8e2-aae3-49b8-80c8-71db98b8f417\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d"
Apr 20 15:02:55.357545 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.357458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.357545 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.357503 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-ca\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.458340 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.458287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.458558 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.458348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-ca\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.458558 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.458378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0fe4c9de-e474-4b63-bc5b-da2d6d40ba00-klusterlet-config\") pod \"klusterlet-addon-workmgr-568757458-778zv\" (UID: \"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"
Apr 20 15:02:55.458558 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.458408 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4864c\" (UniqueName: \"kubernetes.io/projected/0fe4c9de-e474-4b63-bc5b-da2d6d40ba00-kube-api-access-4864c\") pod \"klusterlet-addon-workmgr-568757458-778zv\" (UID: \"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"
Apr 20 15:02:55.458558 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.458438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw8r8\" (UniqueName: \"kubernetes.io/projected/eba5134a-506e-447a-8b4e-946fb42feaf3-kube-api-access-hw8r8\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.459217 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.458887 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fe4c9de-e474-4b63-bc5b-da2d6d40ba00-tmp\") pod \"klusterlet-addon-workmgr-568757458-778zv\" (UID: \"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"
Apr 20 15:02:55.459217 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.458926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/eba5134a-506e-447a-8b4e-946fb42feaf3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.459217 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.458956 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.459217 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.458989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-hub\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:02:55.459217 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.459024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2545e8e2-aae3-49b8-80c8-71db98b8f417-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-67f887d797-rmz8d\" (UID: \"2545e8e2-aae3-49b8-80c8-71db98b8f417\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d"
Apr 20 15:02:55.459217 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.459100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvscn\" (UniqueName: \"kubernetes.io/projected/2545e8e2-aae3-49b8-80c8-71db98b8f417-kube-api-access-rvscn\") pod \"managed-serviceaccount-addon-agent-67f887d797-rmz8d\" (UID: \"2545e8e2-aae3-49b8-80c8-71db98b8f417\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d"
Apr 20 15:02:55.459685 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.459318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fe4c9de-e474-4b63-bc5b-da2d6d40ba00-tmp\") pod \"klusterlet-addon-workmgr-568757458-778zv\" (UID: \"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"
Apr 20 15:02:55.459764
ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.459742 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/eba5134a-506e-447a-8b4e-946fb42feaf3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" Apr 20 15:02:55.463031 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.462579 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-ca\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" Apr 20 15:02:55.463031 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.462829 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-hub\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" Apr 20 15:02:55.463824 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.463104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2545e8e2-aae3-49b8-80c8-71db98b8f417-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-67f887d797-rmz8d\" (UID: \"2545e8e2-aae3-49b8-80c8-71db98b8f417\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" Apr 20 15:02:55.463824 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.463146 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/0fe4c9de-e474-4b63-bc5b-da2d6d40ba00-klusterlet-config\") pod \"klusterlet-addon-workmgr-568757458-778zv\" (UID: \"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" Apr 20 15:02:55.463824 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.463626 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" Apr 20 15:02:55.463926 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.463841 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eba5134a-506e-447a-8b4e-946fb42feaf3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: \"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" Apr 20 15:02:55.466732 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.466583 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4864c\" (UniqueName: \"kubernetes.io/projected/0fe4c9de-e474-4b63-bc5b-da2d6d40ba00-kube-api-access-4864c\") pod \"klusterlet-addon-workmgr-568757458-778zv\" (UID: \"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" Apr 20 15:02:55.467239 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.467216 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw8r8\" (UniqueName: \"kubernetes.io/projected/eba5134a-506e-447a-8b4e-946fb42feaf3-kube-api-access-hw8r8\") pod \"cluster-proxy-proxy-agent-6ccfb7bf47-78sml\" (UID: 
\"eba5134a-506e-447a-8b4e-946fb42feaf3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" Apr 20 15:02:55.471507 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.471212 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvscn\" (UniqueName: \"kubernetes.io/projected/2545e8e2-aae3-49b8-80c8-71db98b8f417-kube-api-access-rvscn\") pod \"managed-serviceaccount-addon-agent-67f887d797-rmz8d\" (UID: \"2545e8e2-aae3-49b8-80c8-71db98b8f417\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" Apr 20 15:02:55.519612 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.519583 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wld75"] Apr 20 15:02:55.522479 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.522453 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5f8cl"] Apr 20 15:02:55.524593 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:55.524552 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c29cec3_b554_4473_90f3_87629635db89.slice/crio-d4c58fd1d537ca806490f9a09ff8e3930b2b07b5ee20a7dea0dab8dee0b292f0 WatchSource:0}: Error finding container d4c58fd1d537ca806490f9a09ff8e3930b2b07b5ee20a7dea0dab8dee0b292f0: Status 404 returned error can't find the container with id d4c58fd1d537ca806490f9a09ff8e3930b2b07b5ee20a7dea0dab8dee0b292f0 Apr 20 15:02:55.525335 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:55.525312 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c75c33_7396_4dd6_8404_bd0e038f65b3.slice/crio-b13fd3a568cc738332d7e36af1c9119c3e127713f8deacd3b8c24dce83f3bc5f WatchSource:0}: Error finding container b13fd3a568cc738332d7e36af1c9119c3e127713f8deacd3b8c24dce83f3bc5f: 
Status 404 returned error can't find the container with id b13fd3a568cc738332d7e36af1c9119c3e127713f8deacd3b8c24dce83f3bc5f Apr 20 15:02:55.530835 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.530816 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" Apr 20 15:02:55.540335 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.540315 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" Apr 20 15:02:55.554883 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.554868 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" Apr 20 15:02:55.559572 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.559556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:02:55.559638 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.559583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:02:55.559699 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:55.559687 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 15:02:55.559749 ip-10-0-134-230 
kubenswrapper[2572]: E0420 15:02:55.559704 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:02:55.559749 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:55.559733 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert podName:eb88787e-8848-4d3f-bcdd-871260569c2c nodeName:}" failed. No retries permitted until 2026-04-20 15:02:59.559719991 +0000 UTC m=+39.690595272 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8p7wk" (UID: "eb88787e-8848-4d3f-bcdd-871260569c2c") : secret "networking-console-plugin-cert" not found Apr 20 15:02:55.559856 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:55.559758 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls podName:20db58c7-db2e-4b2b-be5a-cf2278346010 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:59.559739084 +0000 UTC m=+39.690614379 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls") pod "dns-default-p98mt" (UID: "20db58c7-db2e-4b2b-be5a-cf2278346010") : secret "dns-default-metrics-tls" not found Apr 20 15:02:55.559856 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.559783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:02:55.559856 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.559822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:55.559996 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:55.559884 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:02:55.559996 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:55.559919 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 15:02:55.559996 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:55.559925 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert podName:8337a855-24f1-476f-b9e0-49701fd9bda2 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:59.559915093 +0000 UTC m=+39.690790378 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert") pod "ingress-canary-fdmd9" (UID: "8337a855-24f1-476f-b9e0-49701fd9bda2") : secret "canary-serving-cert" not found Apr 20 15:02:55.559996 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:55.559931 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f46db577b-xgcwj: secret "image-registry-tls" not found Apr 20 15:02:55.559996 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:55.559966 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls podName:f915f3e5-4964-46bc-9adb-e34434ecea10 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:59.559954693 +0000 UTC m=+39.690829998 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls") pod "image-registry-5f46db577b-xgcwj" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10") : secret "image-registry-tls" not found Apr 20 15:02:55.795668 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.795641 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"] Apr 20 15:02:55.798527 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:55.798320 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeba5134a_506e_447a_8b4e_946fb42feaf3.slice/crio-c9a78d3644e146015a2d2fdae06a9b44262cb303960180ff3cec2d7417f86b32 WatchSource:0}: Error finding container c9a78d3644e146015a2d2fdae06a9b44262cb303960180ff3cec2d7417f86b32: Status 404 returned error can't find the container with id c9a78d3644e146015a2d2fdae06a9b44262cb303960180ff3cec2d7417f86b32 Apr 20 15:02:55.799078 ip-10-0-134-230 
kubenswrapper[2572]: I0420 15:02:55.799058 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"] Apr 20 15:02:55.800203 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:55.800155 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d"] Apr 20 15:02:55.822561 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:55.822531 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fe4c9de_e474_4b63_bc5b_da2d6d40ba00.slice/crio-3d5fee562d1df810f3cdd6530e5a0151a2a786defe7ad9dc444330dadd624e13 WatchSource:0}: Error finding container 3d5fee562d1df810f3cdd6530e5a0151a2a786defe7ad9dc444330dadd624e13: Status 404 returned error can't find the container with id 3d5fee562d1df810f3cdd6530e5a0151a2a786defe7ad9dc444330dadd624e13 Apr 20 15:02:55.822755 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:02:55.822732 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2545e8e2_aae3_49b8_80c8_71db98b8f417.slice/crio-1e4ef085a455916efe1b8c00e6e21b3c16eafd4ff072354d0fb7bcb87f7bbeac WatchSource:0}: Error finding container 1e4ef085a455916efe1b8c00e6e21b3c16eafd4ff072354d0fb7bcb87f7bbeac: Status 404 returned error can't find the container with id 1e4ef085a455916efe1b8c00e6e21b3c16eafd4ff072354d0fb7bcb87f7bbeac Apr 20 15:02:56.527401 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:56.527344 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" event={"ID":"2545e8e2-aae3-49b8-80c8-71db98b8f417","Type":"ContainerStarted","Data":"1e4ef085a455916efe1b8c00e6e21b3c16eafd4ff072354d0fb7bcb87f7bbeac"} Apr 20 15:02:56.529626 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:56.529582 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5f8cl" event={"ID":"31c75c33-7396-4dd6-8404-bd0e038f65b3","Type":"ContainerStarted","Data":"b13fd3a568cc738332d7e36af1c9119c3e127713f8deacd3b8c24dce83f3bc5f"} Apr 20 15:02:56.530898 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:56.530849 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wld75" event={"ID":"1c29cec3-b554-4473-90f3-87629635db89","Type":"ContainerStarted","Data":"d4c58fd1d537ca806490f9a09ff8e3930b2b07b5ee20a7dea0dab8dee0b292f0"} Apr 20 15:02:56.536789 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:56.536764 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f117f6f-4e31-4bc5-91d0-9a6176af628e" containerID="6a250036c7035b22597c1983dfeb75e47d19d6ddda632b8330459380f87ae92d" exitCode=0 Apr 20 15:02:56.536872 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:56.536831 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4jvg" event={"ID":"1f117f6f-4e31-4bc5-91d0-9a6176af628e","Type":"ContainerDied","Data":"6a250036c7035b22597c1983dfeb75e47d19d6ddda632b8330459380f87ae92d"} Apr 20 15:02:56.539045 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:56.539024 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" event={"ID":"eba5134a-506e-447a-8b4e-946fb42feaf3","Type":"ContainerStarted","Data":"c9a78d3644e146015a2d2fdae06a9b44262cb303960180ff3cec2d7417f86b32"} Apr 20 15:02:56.540406 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:56.540383 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" event={"ID":"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00","Type":"ContainerStarted","Data":"3d5fee562d1df810f3cdd6530e5a0151a2a786defe7ad9dc444330dadd624e13"} Apr 20 15:02:57.556185 
ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:57.555221 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f117f6f-4e31-4bc5-91d0-9a6176af628e" containerID="241453fbad80f9e7780add1673224d22506a34a996386c4e0f193c360bd776fb" exitCode=0 Apr 20 15:02:57.556185 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:57.555281 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4jvg" event={"ID":"1f117f6f-4e31-4bc5-91d0-9a6176af628e","Type":"ContainerDied","Data":"241453fbad80f9e7780add1673224d22506a34a996386c4e0f193c360bd776fb"} Apr 20 15:02:58.566035 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:58.565402 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4jvg" event={"ID":"1f117f6f-4e31-4bc5-91d0-9a6176af628e","Type":"ContainerStarted","Data":"1910993f103e98a8d13b566c8c47303e6ef9badb96fb46f560b70b1354f3f8cf"} Apr 20 15:02:58.592319 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:58.591656 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-v4jvg" podStartSLOduration=5.925012649 podStartE2EDuration="38.591636724s" podCreationTimestamp="2026-04-20 15:02:20 +0000 UTC" firstStartedPulling="2026-04-20 15:02:22.973005999 +0000 UTC m=+3.103881293" lastFinishedPulling="2026-04-20 15:02:55.639630084 +0000 UTC m=+35.770505368" observedRunningTime="2026-04-20 15:02:58.588378828 +0000 UTC m=+38.719254130" watchObservedRunningTime="2026-04-20 15:02:58.591636724 +0000 UTC m=+38.722512028" Apr 20 15:02:59.599044 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:59.599005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 
15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:59.599055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:59.599098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:02:59.599121 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:59.599156 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:59.599232 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert podName:8337a855-24f1-476f-b9e0-49701fd9bda2 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:07.599212592 +0000 UTC m=+47.730087890 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert") pod "ingress-canary-fdmd9" (UID: "8337a855-24f1-476f-b9e0-49701fd9bda2") : secret "canary-serving-cert" not found Apr 20 15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:59.599276 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:59.599316 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert podName:eb88787e-8848-4d3f-bcdd-871260569c2c nodeName:}" failed. No retries permitted until 2026-04-20 15:03:07.599305613 +0000 UTC m=+47.730180895 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8p7wk" (UID: "eb88787e-8848-4d3f-bcdd-871260569c2c") : secret "networking-console-plugin-cert" not found Apr 20 15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:59.599359 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:59.599366 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f46db577b-xgcwj: secret "image-registry-tls" not found Apr 20 15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:59.599386 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls podName:f915f3e5-4964-46bc-9adb-e34434ecea10 nodeName:}" failed. 
No retries permitted until 2026-04-20 15:03:07.599380315 +0000 UTC m=+47.730255596 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls") pod "image-registry-5f46db577b-xgcwj" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10") : secret "image-registry-tls" not found Apr 20 15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:59.599442 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:02:59.599535 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:02:59.599509 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls podName:20db58c7-db2e-4b2b-be5a-cf2278346010 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:07.599498751 +0000 UTC m=+47.730374032 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls") pod "dns-default-p98mt" (UID: "20db58c7-db2e-4b2b-be5a-cf2278346010") : secret "dns-default-metrics-tls" not found Apr 20 15:03:05.581663 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:05.581610 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" event={"ID":"2545e8e2-aae3-49b8-80c8-71db98b8f417","Type":"ContainerStarted","Data":"80cd4f9f61cd6532a4a85951595013c0e0ad7b87c83cccec2926875d0e344cb9"} Apr 20 15:03:05.583150 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:05.583006 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" event={"ID":"eba5134a-506e-447a-8b4e-946fb42feaf3","Type":"ContainerStarted","Data":"ec075310febc33c784928b97f76bc9c77d9fab8dea6f614a18e9975c45ab142b"} Apr 20 
15:03:05.597152 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:05.597109 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" podStartSLOduration=1.036268798 podStartE2EDuration="10.597096644s" podCreationTimestamp="2026-04-20 15:02:55 +0000 UTC" firstStartedPulling="2026-04-20 15:02:55.824534166 +0000 UTC m=+35.955409447" lastFinishedPulling="2026-04-20 15:03:05.385362009 +0000 UTC m=+45.516237293" observedRunningTime="2026-04-20 15:03:05.596291932 +0000 UTC m=+45.727167227" watchObservedRunningTime="2026-04-20 15:03:05.597096644 +0000 UTC m=+45.727971946" Apr 20 15:03:06.586223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:06.586185 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" event={"ID":"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00","Type":"ContainerStarted","Data":"671a620bca6b7578d39204a878560c625e58e62de4b7f2b7bc61b8dfb2d56d80"} Apr 20 15:03:06.586621 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:06.586445 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" Apr 20 15:03:06.587649 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:06.587622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5f8cl" event={"ID":"31c75c33-7396-4dd6-8404-bd0e038f65b3","Type":"ContainerStarted","Data":"8e1e7f7ef0969a579ccaaa40df77da0c4eab8d109778b4c157153859f21090cf"} Apr 20 15:03:06.587759 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:06.587675 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:03:06.588260 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:06.588241 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" Apr 20 15:03:06.589040 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:06.589010 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wld75" event={"ID":"1c29cec3-b554-4473-90f3-87629635db89","Type":"ContainerStarted","Data":"ad6af7824d8b0954ffc1ba39ad1dedfcad8105806a1287b55507e514cb734495"} Apr 20 15:03:06.610760 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:06.610720 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" podStartSLOduration=1.653330413 podStartE2EDuration="11.610709249s" podCreationTimestamp="2026-04-20 15:02:55 +0000 UTC" firstStartedPulling="2026-04-20 15:02:55.824251438 +0000 UTC m=+35.955126719" lastFinishedPulling="2026-04-20 15:03:05.781630275 +0000 UTC m=+45.912505555" observedRunningTime="2026-04-20 15:03:06.609787997 +0000 UTC m=+46.740663299" watchObservedRunningTime="2026-04-20 15:03:06.610709249 +0000 UTC m=+46.741584548" Apr 20 15:03:06.636194 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:06.636156 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wld75" podStartSLOduration=33.864670265 podStartE2EDuration="43.636144814s" podCreationTimestamp="2026-04-20 15:02:23 +0000 UTC" firstStartedPulling="2026-04-20 15:02:55.614510414 +0000 UTC m=+35.745385698" lastFinishedPulling="2026-04-20 15:03:05.385984952 +0000 UTC m=+45.516860247" observedRunningTime="2026-04-20 15:03:06.635968287 +0000 UTC m=+46.766843602" watchObservedRunningTime="2026-04-20 15:03:06.636144814 +0000 UTC m=+46.767020117" Apr 20 15:03:07.662272 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:07.662234 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:03:07.662713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:07.662283 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:03:07.662713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:07.662334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:03:07.662713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:07.662378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:03:07.662713 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:07.662475 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 15:03:07.662713 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:07.662546 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert podName:eb88787e-8848-4d3f-bcdd-871260569c2c 
nodeName:}" failed. No retries permitted until 2026-04-20 15:03:23.662530328 +0000 UTC m=+63.793405609 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8p7wk" (UID: "eb88787e-8848-4d3f-bcdd-871260569c2c") : secret "networking-console-plugin-cert" not found Apr 20 15:03:07.662713 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:07.662670 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:07.662713 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:07.662681 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:07.663008 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:07.662730 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls podName:20db58c7-db2e-4b2b-be5a-cf2278346010 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:23.66271334 +0000 UTC m=+63.793588632 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls") pod "dns-default-p98mt" (UID: "20db58c7-db2e-4b2b-be5a-cf2278346010") : secret "dns-default-metrics-tls" not found Apr 20 15:03:07.663008 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:07.662748 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert podName:8337a855-24f1-476f-b9e0-49701fd9bda2 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:23.6627382 +0000 UTC m=+63.793613484 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert") pod "ingress-canary-fdmd9" (UID: "8337a855-24f1-476f-b9e0-49701fd9bda2") : secret "canary-serving-cert" not found Apr 20 15:03:07.663008 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:07.662779 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 15:03:07.663008 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:07.662791 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f46db577b-xgcwj: secret "image-registry-tls" not found Apr 20 15:03:07.663008 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:07.662832 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls podName:f915f3e5-4964-46bc-9adb-e34434ecea10 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:23.662816851 +0000 UTC m=+63.793692132 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls") pod "image-registry-5f46db577b-xgcwj" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10") : secret "image-registry-tls" not found Apr 20 15:03:08.598243 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:08.598192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" event={"ID":"eba5134a-506e-447a-8b4e-946fb42feaf3","Type":"ContainerStarted","Data":"ff8a2a83ceff8b4f1970d2c8f6d7ac6cb2b25830e00e6158749d13dea88b9b71"} Apr 20 15:03:09.602909 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:09.602869 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" event={"ID":"eba5134a-506e-447a-8b4e-946fb42feaf3","Type":"ContainerStarted","Data":"61062279f18829d89546d18f932d562c0f4080103373dbbe633b69cab4f02e27"} Apr 20 15:03:09.621790 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:09.621745 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5f8cl" podStartSLOduration=39.840681635 podStartE2EDuration="49.621732357s" podCreationTimestamp="2026-04-20 15:02:20 +0000 UTC" firstStartedPulling="2026-04-20 15:02:55.614428811 +0000 UTC m=+35.745304091" lastFinishedPulling="2026-04-20 15:03:05.395479529 +0000 UTC m=+45.526354813" observedRunningTime="2026-04-20 15:03:06.648535479 +0000 UTC m=+46.779410783" watchObservedRunningTime="2026-04-20 15:03:09.621732357 +0000 UTC m=+49.752607637" Apr 20 15:03:09.621972 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:09.621953 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" podStartSLOduration=1.955577924 podStartE2EDuration="14.621949365s" 
podCreationTimestamp="2026-04-20 15:02:55 +0000 UTC" firstStartedPulling="2026-04-20 15:02:55.800371667 +0000 UTC m=+35.931246951" lastFinishedPulling="2026-04-20 15:03:08.466743094 +0000 UTC m=+48.597618392" observedRunningTime="2026-04-20 15:03:09.620409454 +0000 UTC m=+49.751284758" watchObservedRunningTime="2026-04-20 15:03:09.621949365 +0000 UTC m=+49.752824667" Apr 20 15:03:19.523535 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:19.523507 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhq74" Apr 20 15:03:23.688385 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:23.688338 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:03:23.688385 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:23.688391 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:03:23.688926 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:23.688431 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:03:23.688926 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:23.688459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:03:23.688926 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:23.688475 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:23.688926 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:23.688553 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert podName:8337a855-24f1-476f-b9e0-49701fd9bda2 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:55.688537646 +0000 UTC m=+95.819412927 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert") pod "ingress-canary-fdmd9" (UID: "8337a855-24f1-476f-b9e0-49701fd9bda2") : secret "canary-serving-cert" not found Apr 20 15:03:23.688926 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:23.688557 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 15:03:23.688926 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:23.688565 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:23.688926 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:23.688599 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 15:03:23.688926 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:23.688617 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls 
podName:20db58c7-db2e-4b2b-be5a-cf2278346010 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:55.688604512 +0000 UTC m=+95.819479792 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls") pod "dns-default-p98mt" (UID: "20db58c7-db2e-4b2b-be5a-cf2278346010") : secret "dns-default-metrics-tls" not found Apr 20 15:03:23.688926 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:23.688573 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f46db577b-xgcwj: secret "image-registry-tls" not found Apr 20 15:03:23.688926 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:23.688649 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert podName:eb88787e-8848-4d3f-bcdd-871260569c2c nodeName:}" failed. No retries permitted until 2026-04-20 15:03:55.688635416 +0000 UTC m=+95.819510698 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8p7wk" (UID: "eb88787e-8848-4d3f-bcdd-871260569c2c") : secret "networking-console-plugin-cert" not found Apr 20 15:03:23.688926 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:23.688667 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls podName:f915f3e5-4964-46bc-9adb-e34434ecea10 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:55.688657542 +0000 UTC m=+95.819532824 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls") pod "image-registry-5f46db577b-xgcwj" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10") : secret "image-registry-tls" not found Apr 20 15:03:26.208067 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:26.208032 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:03:26.208437 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:26.208156 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 15:03:26.208437 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:26.208211 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs podName:7215f3fe-093a-42b9-bea0-26a93cb4e1ff nodeName:}" failed. No retries permitted until 2026-04-20 15:04:30.208195791 +0000 UTC m=+130.339071075 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs") pod "network-metrics-daemon-sq52t" (UID: "7215f3fe-093a-42b9-bea0-26a93cb4e1ff") : secret "metrics-daemon-secret" not found Apr 20 15:03:37.593547 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:37.593509 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5f8cl" Apr 20 15:03:55.726458 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:55.726321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:03:55.726458 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:55.726364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:03:55.726458 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:55.726434 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:03:55.727084 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:03:55.726460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:03:55.727084 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:55.726463 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:55.727084 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:55.726525 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:55.727084 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:55.726540 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 15:03:55.727084 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:55.726565 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 15:03:55.727084 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:55.726584 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f46db577b-xgcwj: secret "image-registry-tls" not found Apr 20 15:03:55.727084 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:55.726576 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls podName:20db58c7-db2e-4b2b-be5a-cf2278346010 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:59.72655182 +0000 UTC m=+159.857427106 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls") pod "dns-default-p98mt" (UID: "20db58c7-db2e-4b2b-be5a-cf2278346010") : secret "dns-default-metrics-tls" not found Apr 20 15:03:55.727084 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:55.726643 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert podName:8337a855-24f1-476f-b9e0-49701fd9bda2 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:59.726627934 +0000 UTC m=+159.857503221 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert") pod "ingress-canary-fdmd9" (UID: "8337a855-24f1-476f-b9e0-49701fd9bda2") : secret "canary-serving-cert" not found Apr 20 15:03:55.727084 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:55.726663 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert podName:eb88787e-8848-4d3f-bcdd-871260569c2c nodeName:}" failed. No retries permitted until 2026-04-20 15:04:59.726655007 +0000 UTC m=+159.857530288 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8p7wk" (UID: "eb88787e-8848-4d3f-bcdd-871260569c2c") : secret "networking-console-plugin-cert" not found Apr 20 15:03:55.727084 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:03:55.726681 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls podName:f915f3e5-4964-46bc-9adb-e34434ecea10 nodeName:}" failed. 
No retries permitted until 2026-04-20 15:04:59.726673287 +0000 UTC m=+159.857548568 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls") pod "image-registry-5f46db577b-xgcwj" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10") : secret "image-registry-tls" not found Apr 20 15:04:30.282036 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:04:30.281980 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t" Apr 20 15:04:30.282557 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:30.282117 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 15:04:30.282557 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:30.282187 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs podName:7215f3fe-093a-42b9-bea0-26a93cb4e1ff nodeName:}" failed. No retries permitted until 2026-04-20 15:06:32.282169745 +0000 UTC m=+252.413045032 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs") pod "network-metrics-daemon-sq52t" (UID: "7215f3fe-093a-42b9-bea0-26a93cb4e1ff") : secret "metrics-daemon-secret" not found Apr 20 15:04:54.798749 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:54.798692 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" podUID="eb88787e-8848-4d3f-bcdd-871260569c2c" Apr 20 15:04:54.815892 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:54.815860 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" podUID="f915f3e5-4964-46bc-9adb-e34434ecea10" Apr 20 15:04:54.834093 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:54.834067 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-p98mt" podUID="20db58c7-db2e-4b2b-be5a-cf2278346010" Apr 20 15:04:54.848780 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:04:54.848759 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:04:54.848780 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:04:54.848766 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:04:54.848939 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:04:54.848766 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-p98mt" Apr 20 15:04:54.851339 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:54.851304 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-fdmd9" podUID="8337a855-24f1-476f-b9e0-49701fd9bda2" Apr 20 15:04:56.424171 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:56.424133 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-sq52t" podUID="7215f3fe-093a-42b9-bea0-26a93cb4e1ff" Apr 20 15:04:58.148878 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:04:58.148849 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p5xng_034b6f21-f85a-4440-a061-d39a6df5dde4/dns-node-resolver/0.log" Apr 20 15:04:59.348970 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:04:59.348942 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-x8tdg_22757559-941f-4d9e-9128-3aeefc6665f3/node-ca/0.log" Apr 20 15:04:59.801862 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:04:59.801833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:04:59.802007 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:04:59.801869 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: 
\"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:04:59.802007 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:04:59.801917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:04:59.802007 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:04:59.801936 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:04:59.802007 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:59.801972 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:04:59.802234 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:59.802016 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 15:04:59.802234 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:59.802026 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f46db577b-xgcwj: secret "image-registry-tls" not found Apr 20 15:04:59.802234 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:59.802049 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls podName:20db58c7-db2e-4b2b-be5a-cf2278346010 nodeName:}" failed. No retries permitted until 2026-04-20 15:07:01.802031927 +0000 UTC m=+281.932907212 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls") pod "dns-default-p98mt" (UID: "20db58c7-db2e-4b2b-be5a-cf2278346010") : secret "dns-default-metrics-tls" not found Apr 20 15:04:59.802234 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:59.802048 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 15:04:59.802234 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:59.802058 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:04:59.802234 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:59.802070 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls podName:f915f3e5-4964-46bc-9adb-e34434ecea10 nodeName:}" failed. No retries permitted until 2026-04-20 15:07:01.802057948 +0000 UTC m=+281.932933229 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls") pod "image-registry-5f46db577b-xgcwj" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10") : secret "image-registry-tls" not found Apr 20 15:04:59.802234 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:59.802083 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert podName:eb88787e-8848-4d3f-bcdd-871260569c2c nodeName:}" failed. No retries permitted until 2026-04-20 15:07:01.802075993 +0000 UTC m=+281.932951273 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8p7wk" (UID: "eb88787e-8848-4d3f-bcdd-871260569c2c") : secret "networking-console-plugin-cert" not found Apr 20 15:04:59.802234 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:04:59.802108 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert podName:8337a855-24f1-476f-b9e0-49701fd9bda2 nodeName:}" failed. No retries permitted until 2026-04-20 15:07:01.802095428 +0000 UTC m=+281.932970709 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert") pod "ingress-canary-fdmd9" (UID: "8337a855-24f1-476f-b9e0-49701fd9bda2") : secret "canary-serving-cert" not found Apr 20 15:05:05.405322 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:05.405282 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fdmd9"
Apr 20 15:05:05.874598 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:05.874563 2572 generic.go:358] "Generic (PLEG): container finished" podID="2545e8e2-aae3-49b8-80c8-71db98b8f417" containerID="80cd4f9f61cd6532a4a85951595013c0e0ad7b87c83cccec2926875d0e344cb9" exitCode=255
Apr 20 15:05:05.874790 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:05.874613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" event={"ID":"2545e8e2-aae3-49b8-80c8-71db98b8f417","Type":"ContainerDied","Data":"80cd4f9f61cd6532a4a85951595013c0e0ad7b87c83cccec2926875d0e344cb9"}
Apr 20 15:05:05.874909 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:05.874896 2572 scope.go:117] "RemoveContainer" containerID="80cd4f9f61cd6532a4a85951595013c0e0ad7b87c83cccec2926875d0e344cb9"
Apr 20 15:05:06.587318 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:06.587253 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" podUID="0fe4c9de-e474-4b63-bc5b-da2d6d40ba00" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.12:8000/readyz\": dial tcp 10.132.0.12:8000: connect: connection refused"
Apr 20 15:05:06.877791 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:06.877712 2572 generic.go:358] "Generic (PLEG): container finished" podID="0fe4c9de-e474-4b63-bc5b-da2d6d40ba00" containerID="671a620bca6b7578d39204a878560c625e58e62de4b7f2b7bc61b8dfb2d56d80" exitCode=1
Apr 20 15:05:06.877791 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:06.877770 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" event={"ID":"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00","Type":"ContainerDied","Data":"671a620bca6b7578d39204a878560c625e58e62de4b7f2b7bc61b8dfb2d56d80"}
Apr 20 15:05:06.878111 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:06.878094 2572 scope.go:117] "RemoveContainer" containerID="671a620bca6b7578d39204a878560c625e58e62de4b7f2b7bc61b8dfb2d56d80"
Apr 20 15:05:06.879380 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:06.879359 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" event={"ID":"2545e8e2-aae3-49b8-80c8-71db98b8f417","Type":"ContainerStarted","Data":"c77b9b29b96a94db84ef7df69a2ef6963463723f228630e4b104d74ac275e8ca"}
Apr 20 15:05:07.883033 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:07.882999 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" event={"ID":"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00","Type":"ContainerStarted","Data":"1d09a46b556136dcd2ecd66d309360bf4fcc7b55f91a4f338910c757117b2cd8"}
Apr 20 15:05:07.883440 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:07.883374 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"
Apr 20 15:05:07.883796 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:07.883779 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv"
Apr 20 15:05:10.407925 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:10.407859 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:05:28.213526 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.213474 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7qmvr"]
Apr 20 15:05:28.216791 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.216771 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.220193 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.220170 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 15:05:28.220308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.220168 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 15:05:28.220308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.220218 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 15:05:28.220308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.220171 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8mlqf\""
Apr 20 15:05:28.220308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.220171 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 15:05:28.227883 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.227865 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7qmvr"]
Apr 20 15:05:28.317500 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.317449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.317615 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.317531 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qbh2\" (UniqueName: \"kubernetes.io/projected/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-kube-api-access-4qbh2\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.317615 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.317574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-crio-socket\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.317615 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.317607 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-data-volume\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.317746 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.317665 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.418446 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.418421 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-data-volume\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.418610 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.418592 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.418677 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.418643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.418737 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.418680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qbh2\" (UniqueName: \"kubernetes.io/projected/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-kube-api-access-4qbh2\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.418789 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.418732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-crio-socket\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.418789 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.418770 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-data-volume\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.418875 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.418797 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-crio-socket\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.419194 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.419172 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.421553 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.421532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.431000 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.430982 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qbh2\" (UniqueName: \"kubernetes.io/projected/ed7ca0c9-d248-41fb-a871-7313c1a4e1eb-kube-api-access-4qbh2\") pod \"insights-runtime-extractor-7qmvr\" (UID: \"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb\") " pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.526034 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.526012 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7qmvr"
Apr 20 15:05:28.646008 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.642969 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7qmvr"]
Apr 20 15:05:28.649826 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:05:28.649800 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded7ca0c9_d248_41fb_a871_7313c1a4e1eb.slice/crio-d9ff85a286141c09daab52c95e17df625ed05465862920b5d3adaf286f3b24e0 WatchSource:0}: Error finding container d9ff85a286141c09daab52c95e17df625ed05465862920b5d3adaf286f3b24e0: Status 404 returned error can't find the container with id d9ff85a286141c09daab52c95e17df625ed05465862920b5d3adaf286f3b24e0
Apr 20 15:05:28.932430 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.932341 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7qmvr" event={"ID":"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb","Type":"ContainerStarted","Data":"be53589bc0e264397dd576a327056a57f8c52206362c6db97316195df4ff4527"}
Apr 20 15:05:28.932430 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:28.932383 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7qmvr" event={"ID":"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb","Type":"ContainerStarted","Data":"d9ff85a286141c09daab52c95e17df625ed05465862920b5d3adaf286f3b24e0"}
Apr 20 15:05:29.936070 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:29.936037 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7qmvr" event={"ID":"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb","Type":"ContainerStarted","Data":"46f08eeba4a7860bdb42845abf0b141c86e04b76efdf7157a182ea0702a269ce"}
Apr 20 15:05:31.942662 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:31.942621 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7qmvr" event={"ID":"ed7ca0c9-d248-41fb-a871-7313c1a4e1eb","Type":"ContainerStarted","Data":"49c8c64cddc61349131cee117a14e17aeafb2e62bce89c36e64de1c7e1bb5214"}
Apr 20 15:05:31.960434 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:31.960389 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7qmvr" podStartSLOduration=1.399125208 podStartE2EDuration="3.960376554s" podCreationTimestamp="2026-04-20 15:05:28 +0000 UTC" firstStartedPulling="2026-04-20 15:05:28.702106583 +0000 UTC m=+188.832981865" lastFinishedPulling="2026-04-20 15:05:31.263357916 +0000 UTC m=+191.394233211" observedRunningTime="2026-04-20 15:05:31.958971771 +0000 UTC m=+192.089847074" watchObservedRunningTime="2026-04-20 15:05:31.960376554 +0000 UTC m=+192.091251898"
Apr 20 15:05:37.527575 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.527543 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4rv5w"]
Apr 20 15:05:37.530713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.530695 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.534776 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.534756 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 15:05:37.534895 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.534804 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 15:05:37.534895 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.534875 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 15:05:37.535012 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.534989 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 15:05:37.535852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.535835 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 15:05:37.535938 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.535881 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 15:05:37.535980 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.535939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lk8g4\""
Apr 20 15:05:37.586460 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.586436 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-accelerators-collector-config\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.586584 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.586466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9732813-f0fb-4123-952e-fbb3c1c45a99-metrics-client-ca\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.586584 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.586556 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a9732813-f0fb-4123-952e-fbb3c1c45a99-root\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.586653 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.586596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-tls\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.586653 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.586618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-textfile\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.586717 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.586676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.586717 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.586701 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tn4m\" (UniqueName: \"kubernetes.io/projected/a9732813-f0fb-4123-952e-fbb3c1c45a99-kube-api-access-8tn4m\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.586784 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.586769 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9732813-f0fb-4123-952e-fbb3c1c45a99-sys\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.586818 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.586792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-wtmp\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687394 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687359 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-accelerators-collector-config\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687394 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9732813-f0fb-4123-952e-fbb3c1c45a99-metrics-client-ca\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687597 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a9732813-f0fb-4123-952e-fbb3c1c45a99-root\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687597 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687536 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a9732813-f0fb-4123-952e-fbb3c1c45a99-root\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687597 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687574 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-tls\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687690 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-textfile\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687690 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687650 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687690 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687678 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tn4m\" (UniqueName: \"kubernetes.io/projected/a9732813-f0fb-4123-952e-fbb3c1c45a99-kube-api-access-8tn4m\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687777 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:05:37.687720 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 15:05:37.687777 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687748 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9732813-f0fb-4123-952e-fbb3c1c45a99-sys\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687839 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687788 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9732813-f0fb-4123-952e-fbb3c1c45a99-sys\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687839 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:05:37.687801 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-tls podName:a9732813-f0fb-4123-952e-fbb3c1c45a99 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:38.187777862 +0000 UTC m=+198.318653151 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-tls") pod "node-exporter-4rv5w" (UID: "a9732813-f0fb-4123-952e-fbb3c1c45a99") : secret "node-exporter-tls" not found
Apr 20 15:05:37.687945 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687925 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-textfile\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.687992 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.687977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-wtmp\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.688072 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.688055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-accelerators-collector-config\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.688125 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.688089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9732813-f0fb-4123-952e-fbb3c1c45a99-metrics-client-ca\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.688173 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.688142 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-wtmp\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.689921 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.689904 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:37.698025 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:37.698003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tn4m\" (UniqueName: \"kubernetes.io/projected/a9732813-f0fb-4123-952e-fbb3c1c45a99-kube-api-access-8tn4m\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:38.191722 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:38.191687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-tls\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:38.193923 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:38.193904 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9732813-f0fb-4123-952e-fbb3c1c45a99-node-exporter-tls\") pod \"node-exporter-4rv5w\" (UID: \"a9732813-f0fb-4123-952e-fbb3c1c45a99\") " pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:38.439520 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:38.439476 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4rv5w"
Apr 20 15:05:38.447163 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:05:38.447136 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9732813_f0fb_4123_952e_fbb3c1c45a99.slice/crio-e9dc33b430daede555cedfaf8b9ddbe7378e99eb89bfb4f9e3e4fa63762ae8fb WatchSource:0}: Error finding container e9dc33b430daede555cedfaf8b9ddbe7378e99eb89bfb4f9e3e4fa63762ae8fb: Status 404 returned error can't find the container with id e9dc33b430daede555cedfaf8b9ddbe7378e99eb89bfb4f9e3e4fa63762ae8fb
Apr 20 15:05:38.964105 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:38.964064 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4rv5w" event={"ID":"a9732813-f0fb-4123-952e-fbb3c1c45a99","Type":"ContainerStarted","Data":"e9dc33b430daede555cedfaf8b9ddbe7378e99eb89bfb4f9e3e4fa63762ae8fb"}
Apr 20 15:05:39.967351 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:39.967317 2572 generic.go:358] "Generic (PLEG): container finished" podID="a9732813-f0fb-4123-952e-fbb3c1c45a99" containerID="0284f75bed4f546a063c6302b8d76f657075593ac4af406227b138eabc2d5cbc" exitCode=0
Apr 20 15:05:39.967731 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:39.967362 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4rv5w" event={"ID":"a9732813-f0fb-4123-952e-fbb3c1c45a99","Type":"ContainerDied","Data":"0284f75bed4f546a063c6302b8d76f657075593ac4af406227b138eabc2d5cbc"}
Apr 20 15:05:40.971714 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:40.971678 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4rv5w" event={"ID":"a9732813-f0fb-4123-952e-fbb3c1c45a99","Type":"ContainerStarted","Data":"5069065baeb01ccaa399e0c3d73dafcddd7a9e9ce9b93d10a4a2efb03dce150a"}
Apr 20 15:05:40.971714 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:40.971712 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4rv5w" event={"ID":"a9732813-f0fb-4123-952e-fbb3c1c45a99","Type":"ContainerStarted","Data":"a026a63e480b28fbbc5fd8bf9783045728f38129ae923ae72803294da2359c22"}
Apr 20 15:05:40.991392 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:40.991351 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4rv5w" podStartSLOduration=3.004449815 podStartE2EDuration="3.991337684s" podCreationTimestamp="2026-04-20 15:05:37 +0000 UTC" firstStartedPulling="2026-04-20 15:05:38.448742033 +0000 UTC m=+198.579617317" lastFinishedPulling="2026-04-20 15:05:39.435629905 +0000 UTC m=+199.566505186" observedRunningTime="2026-04-20 15:05:40.990106879 +0000 UTC m=+201.120982182" watchObservedRunningTime="2026-04-20 15:05:40.991337684 +0000 UTC m=+201.122212988"
Apr 20 15:05:45.556249 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:45.556207 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" podUID="eba5134a-506e-447a-8b4e-946fb42feaf3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 15:05:55.556259 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:05:55.556218 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" podUID="eba5134a-506e-447a-8b4e-946fb42feaf3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 15:06:05.556373 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:05.556337 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" podUID="eba5134a-506e-447a-8b4e-946fb42feaf3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 15:06:05.556904 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:05.556409 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml"
Apr 20 15:06:05.556904 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:05.556873 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"61062279f18829d89546d18f932d562c0f4080103373dbbe633b69cab4f02e27"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 20 15:06:05.557001 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:05.556932 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" podUID="eba5134a-506e-447a-8b4e-946fb42feaf3" containerName="service-proxy" containerID="cri-o://61062279f18829d89546d18f932d562c0f4080103373dbbe633b69cab4f02e27" gracePeriod=30
Apr 20 15:06:06.031101 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:06.031066 2572 generic.go:358] "Generic (PLEG): container finished" podID="eba5134a-506e-447a-8b4e-946fb42feaf3" containerID="61062279f18829d89546d18f932d562c0f4080103373dbbe633b69cab4f02e27" exitCode=2
Apr 20 15:06:06.031256 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:06.031107 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" event={"ID":"eba5134a-506e-447a-8b4e-946fb42feaf3","Type":"ContainerDied","Data":"61062279f18829d89546d18f932d562c0f4080103373dbbe633b69cab4f02e27"}
Apr 20 15:06:06.031256 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:06.031143 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6ccfb7bf47-78sml" event={"ID":"eba5134a-506e-447a-8b4e-946fb42feaf3","Type":"ContainerStarted","Data":"658a3aba27c08c12ca5d99b072bd44f1f2af474adcd99db277f92bc89cda7c9c"}
Apr 20 15:06:21.583969 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:21.583938 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4rv5w_a9732813-f0fb-4123-952e-fbb3c1c45a99/init-textfile/0.log"
Apr 20 15:06:21.784998 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:21.784972 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4rv5w_a9732813-f0fb-4123-952e-fbb3c1c45a99/node-exporter/0.log"
Apr 20 15:06:21.983321 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:21.983249 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4rv5w_a9732813-f0fb-4123-952e-fbb3c1c45a99/kube-rbac-proxy/0.log"
Apr 20 15:06:30.383803 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:30.383769 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p5xng_034b6f21-f85a-4440-a061-d39a6df5dde4/dns-node-resolver/0.log"
Apr 20 15:06:31.583556 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:31.583528 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-x8tdg_22757559-941f-4d9e-9128-3aeefc6665f3/node-ca/0.log"
Apr 20 15:06:32.324028 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:32.323990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:06:32.326290 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:32.326259 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7215f3fe-093a-42b9-bea0-26a93cb4e1ff-metrics-certs\") pod \"network-metrics-daemon-sq52t\" (UID: \"7215f3fe-093a-42b9-bea0-26a93cb4e1ff\") " pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:06:32.611325 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:32.611250 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-64ff7\""
Apr 20 15:06:32.619189 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:32.619163 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sq52t"
Apr 20 15:06:32.746828 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:06:32.746796 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7215f3fe_093a_42b9_bea0_26a93cb4e1ff.slice/crio-50da6b50bab44490ba89162b54bf251de3e645c0e6a2ecf078240e0aff36be14 WatchSource:0}: Error finding container 50da6b50bab44490ba89162b54bf251de3e645c0e6a2ecf078240e0aff36be14: Status 404 returned error can't find the container with id 50da6b50bab44490ba89162b54bf251de3e645c0e6a2ecf078240e0aff36be14
Apr 20 15:06:32.747374 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:32.747212 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sq52t"]
Apr 20 15:06:33.098305 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:33.098258 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sq52t" event={"ID":"7215f3fe-093a-42b9-bea0-26a93cb4e1ff","Type":"ContainerStarted","Data":"50da6b50bab44490ba89162b54bf251de3e645c0e6a2ecf078240e0aff36be14"}
Apr 20 15:06:34.103023 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:34.102992 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sq52t" event={"ID":"7215f3fe-093a-42b9-bea0-26a93cb4e1ff","Type":"ContainerStarted","Data":"419e6c196a40757878a02b24097022133b984d34ae953e037da733936a4e5aa9"}
Apr 20 15:06:34.103420 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:34.103031 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sq52t" event={"ID":"7215f3fe-093a-42b9-bea0-26a93cb4e1ff","Type":"ContainerStarted","Data":"ba1eb1c5294493bdb7bea1d1955ee6dfb145b9965008811cb16ace4e257ef6b0"}
Apr 20 15:06:34.120315 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:34.120260 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sq52t" podStartSLOduration=253.158993461 podStartE2EDuration="4m14.120242914s" podCreationTimestamp="2026-04-20 15:02:20 +0000 UTC" firstStartedPulling="2026-04-20 15:06:32.748634253 +0000 UTC m=+252.879509533" lastFinishedPulling="2026-04-20 15:06:33.709883705 +0000 UTC m=+253.840758986" observedRunningTime="2026-04-20 15:06:34.11905883 +0000 UTC m=+254.249934134" watchObservedRunningTime="2026-04-20 15:06:34.120242914 +0000 UTC m=+254.251118218"
Apr 20 15:06:57.849227 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:06:57.849186 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-p98mt" podUID="20db58c7-db2e-4b2b-be5a-cf2278346010"
Apr 20 15:06:57.849227 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:06:57.849194 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted
volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" podUID="eb88787e-8848-4d3f-bcdd-871260569c2c" Apr 20 15:06:57.849694 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:06:57.849237 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" podUID="f915f3e5-4964-46bc-9adb-e34434ecea10" Apr 20 15:06:58.162549 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:58.162462 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:06:58.162682 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:58.162462 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p98mt" Apr 20 15:06:58.162682 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:06:58.162462 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:07:01.847888 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:01.847854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:07:01.848335 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:01.847896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:07:01.848335 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:01.847941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:07:01.848335 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:01.847969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:07:01.850300 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:01.850274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/20db58c7-db2e-4b2b-be5a-cf2278346010-metrics-tls\") pod \"dns-default-p98mt\" (UID: \"20db58c7-db2e-4b2b-be5a-cf2278346010\") " pod="openshift-dns/dns-default-p98mt" Apr 20 15:07:01.850300 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:01.850282 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"image-registry-5f46db577b-xgcwj\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:07:01.850662 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:01.850645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb88787e-8848-4d3f-bcdd-871260569c2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8p7wk\" (UID: \"eb88787e-8848-4d3f-bcdd-871260569c2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:07:01.850767 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:01.850748 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8337a855-24f1-476f-b9e0-49701fd9bda2-cert\") pod \"ingress-canary-fdmd9\" (UID: \"8337a855-24f1-476f-b9e0-49701fd9bda2\") " pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:07:02.067073 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.067041 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vdrzr\"" Apr 20 15:07:02.067270 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.067104 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nfkdq\"" Apr 20 15:07:02.067270 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.067105 2572 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-dtcpk\"" Apr 20 15:07:02.073973 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.073951 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p98mt" Apr 20 15:07:02.074085 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.073979 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:07:02.074085 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.073992 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" Apr 20 15:07:02.108537 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.108447 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4l2x2\"" Apr 20 15:07:02.116616 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.116575 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fdmd9" Apr 20 15:07:02.225521 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.225447 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p98mt"] Apr 20 15:07:02.229974 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:07:02.229938 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20db58c7_db2e_4b2b_be5a_cf2278346010.slice/crio-14c28a0e9cecf4b5a74aef49f531486121c5e514c72676d548835a6835d3aafa WatchSource:0}: Error finding container 14c28a0e9cecf4b5a74aef49f531486121c5e514c72676d548835a6835d3aafa: Status 404 returned error can't find the container with id 14c28a0e9cecf4b5a74aef49f531486121c5e514c72676d548835a6835d3aafa Apr 20 15:07:02.268292 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.268258 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fdmd9"] Apr 20 15:07:02.272029 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:07:02.272005 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8337a855_24f1_476f_b9e0_49701fd9bda2.slice/crio-102f8a25088592815495cc21301201b11cac492096ea9c014e8888221c1471c3 WatchSource:0}: Error finding container 102f8a25088592815495cc21301201b11cac492096ea9c014e8888221c1471c3: Status 404 returned error can't find the container with id 102f8a25088592815495cc21301201b11cac492096ea9c014e8888221c1471c3 Apr 20 15:07:02.443189 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.443110 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk"] Apr 20 15:07:02.446024 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:02.445997 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f46db577b-xgcwj"] Apr 20 15:07:02.446690 
ip-10-0-134-230 kubenswrapper[2572]: W0420 15:07:02.446666 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb88787e_8848_4d3f_bcdd_871260569c2c.slice/crio-b917697ebc53434aee78bfef47438f7fd58ca5154ea6e5d565e16810ea1f55dc WatchSource:0}: Error finding container b917697ebc53434aee78bfef47438f7fd58ca5154ea6e5d565e16810ea1f55dc: Status 404 returned error can't find the container with id b917697ebc53434aee78bfef47438f7fd58ca5154ea6e5d565e16810ea1f55dc Apr 20 15:07:02.449353 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:07:02.449329 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf915f3e5_4964_46bc_9adb_e34434ecea10.slice/crio-162c3967496629c0135b5dd20a29948e016c75f2e8f54c536c612421bf5c0466 WatchSource:0}: Error finding container 162c3967496629c0135b5dd20a29948e016c75f2e8f54c536c612421bf5c0466: Status 404 returned error can't find the container with id 162c3967496629c0135b5dd20a29948e016c75f2e8f54c536c612421bf5c0466 Apr 20 15:07:03.179076 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:03.179038 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" event={"ID":"eb88787e-8848-4d3f-bcdd-871260569c2c","Type":"ContainerStarted","Data":"b917697ebc53434aee78bfef47438f7fd58ca5154ea6e5d565e16810ea1f55dc"} Apr 20 15:07:03.180621 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:03.180433 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p98mt" event={"ID":"20db58c7-db2e-4b2b-be5a-cf2278346010","Type":"ContainerStarted","Data":"14c28a0e9cecf4b5a74aef49f531486121c5e514c72676d548835a6835d3aafa"} Apr 20 15:07:03.181539 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:03.181512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fdmd9" 
event={"ID":"8337a855-24f1-476f-b9e0-49701fd9bda2","Type":"ContainerStarted","Data":"102f8a25088592815495cc21301201b11cac492096ea9c014e8888221c1471c3"} Apr 20 15:07:03.183004 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:03.182978 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" event={"ID":"f915f3e5-4964-46bc-9adb-e34434ecea10","Type":"ContainerStarted","Data":"22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320"} Apr 20 15:07:03.183123 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:03.183012 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" event={"ID":"f915f3e5-4964-46bc-9adb-e34434ecea10","Type":"ContainerStarted","Data":"162c3967496629c0135b5dd20a29948e016c75f2e8f54c536c612421bf5c0466"} Apr 20 15:07:03.183226 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:03.183199 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:07:03.204539 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:03.204456 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" podStartSLOduration=283.204443139 podStartE2EDuration="4m43.204443139s" podCreationTimestamp="2026-04-20 15:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:07:03.202841403 +0000 UTC m=+283.333716718" watchObservedRunningTime="2026-04-20 15:07:03.204443139 +0000 UTC m=+283.335318441" Apr 20 15:07:04.187117 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:04.187077 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" 
event={"ID":"eb88787e-8848-4d3f-bcdd-871260569c2c","Type":"ContainerStarted","Data":"473717a6718bd8e759801c3d94a5807a3f4eb995e0b9d11b717c34150921de1e"} Apr 20 15:07:04.202928 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:04.202882 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8p7wk" podStartSLOduration=260.973602342 podStartE2EDuration="4m22.202865777s" podCreationTimestamp="2026-04-20 15:02:42 +0000 UTC" firstStartedPulling="2026-04-20 15:07:02.448760503 +0000 UTC m=+282.579635786" lastFinishedPulling="2026-04-20 15:07:03.678023927 +0000 UTC m=+283.808899221" observedRunningTime="2026-04-20 15:07:04.202759461 +0000 UTC m=+284.333634764" watchObservedRunningTime="2026-04-20 15:07:04.202865777 +0000 UTC m=+284.333741081" Apr 20 15:07:05.191328 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:05.191293 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p98mt" event={"ID":"20db58c7-db2e-4b2b-be5a-cf2278346010","Type":"ContainerStarted","Data":"106f264c8c476ad181627b2ef70a3d8a83e11908851c67fe4a9d349725b1d241"} Apr 20 15:07:05.191733 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:05.191340 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p98mt" event={"ID":"20db58c7-db2e-4b2b-be5a-cf2278346010","Type":"ContainerStarted","Data":"99609250b0567b5b741337e39953281a6327d59fcca9ff6d6f90f14b40e571bf"} Apr 20 15:07:05.191733 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:05.191528 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p98mt" Apr 20 15:07:05.192898 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:05.192871 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fdmd9" 
event={"ID":"8337a855-24f1-476f-b9e0-49701fd9bda2","Type":"ContainerStarted","Data":"8fe40c06871c392df3cfeca699e387b82cfacb6806e570b9760077bb3fbf4c99"} Apr 20 15:07:05.207848 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:05.207769 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p98mt" podStartSLOduration=251.492837918 podStartE2EDuration="4m14.207756486s" podCreationTimestamp="2026-04-20 15:02:51 +0000 UTC" firstStartedPulling="2026-04-20 15:07:02.232160679 +0000 UTC m=+282.363035960" lastFinishedPulling="2026-04-20 15:07:04.947079244 +0000 UTC m=+285.077954528" observedRunningTime="2026-04-20 15:07:05.206738634 +0000 UTC m=+285.337613938" watchObservedRunningTime="2026-04-20 15:07:05.207756486 +0000 UTC m=+285.338631788" Apr 20 15:07:05.223432 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:05.223393 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fdmd9" podStartSLOduration=251.546511353 podStartE2EDuration="4m14.223381006s" podCreationTimestamp="2026-04-20 15:02:51 +0000 UTC" firstStartedPulling="2026-04-20 15:07:02.273800787 +0000 UTC m=+282.404676068" lastFinishedPulling="2026-04-20 15:07:04.95067044 +0000 UTC m=+285.081545721" observedRunningTime="2026-04-20 15:07:05.22237225 +0000 UTC m=+285.353247553" watchObservedRunningTime="2026-04-20 15:07:05.223381006 +0000 UTC m=+285.354256320" Apr 20 15:07:06.196813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:06.196781 2572 generic.go:358] "Generic (PLEG): container finished" podID="2545e8e2-aae3-49b8-80c8-71db98b8f417" containerID="c77b9b29b96a94db84ef7df69a2ef6963463723f228630e4b104d74ac275e8ca" exitCode=255 Apr 20 15:07:06.197214 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:06.196862 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" 
event={"ID":"2545e8e2-aae3-49b8-80c8-71db98b8f417","Type":"ContainerDied","Data":"c77b9b29b96a94db84ef7df69a2ef6963463723f228630e4b104d74ac275e8ca"} Apr 20 15:07:06.197214 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:06.196904 2572 scope.go:117] "RemoveContainer" containerID="80cd4f9f61cd6532a4a85951595013c0e0ad7b87c83cccec2926875d0e344cb9" Apr 20 15:07:06.197434 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:06.197417 2572 scope.go:117] "RemoveContainer" containerID="c77b9b29b96a94db84ef7df69a2ef6963463723f228630e4b104d74ac275e8ca" Apr 20 15:07:06.197648 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:07:06.197616 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=addon-agent pod=managed-serviceaccount-addon-agent-67f887d797-rmz8d_open-cluster-management-agent-addon(2545e8e2-aae3-49b8-80c8-71db98b8f417)\"" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" podUID="2545e8e2-aae3-49b8-80c8-71db98b8f417" Apr 20 15:07:07.200676 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:07.200645 2572 generic.go:358] "Generic (PLEG): container finished" podID="0fe4c9de-e474-4b63-bc5b-da2d6d40ba00" containerID="1d09a46b556136dcd2ecd66d309360bf4fcc7b55f91a4f338910c757117b2cd8" exitCode=1 Apr 20 15:07:07.201069 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:07.200723 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" event={"ID":"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00","Type":"ContainerDied","Data":"1d09a46b556136dcd2ecd66d309360bf4fcc7b55f91a4f338910c757117b2cd8"} Apr 20 15:07:07.201069 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:07.200761 2572 scope.go:117] "RemoveContainer" containerID="671a620bca6b7578d39204a878560c625e58e62de4b7f2b7bc61b8dfb2d56d80" Apr 20 15:07:07.201177 ip-10-0-134-230 kubenswrapper[2572]: I0420 
15:07:07.201087 2572 scope.go:117] "RemoveContainer" containerID="1d09a46b556136dcd2ecd66d309360bf4fcc7b55f91a4f338910c757117b2cd8" Apr 20 15:07:07.201326 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:07:07.201304 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=acm-agent pod=klusterlet-addon-workmgr-568757458-778zv_open-cluster-management-agent-addon(0fe4c9de-e474-4b63-bc5b-da2d6d40ba00)\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" podUID="0fe4c9de-e474-4b63-bc5b-da2d6d40ba00" Apr 20 15:07:07.884228 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:07.884184 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" Apr 20 15:07:08.205856 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:08.205782 2572 scope.go:117] "RemoveContainer" containerID="1d09a46b556136dcd2ecd66d309360bf4fcc7b55f91a4f338910c757117b2cd8" Apr 20 15:07:08.206196 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:07:08.205955 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=acm-agent pod=klusterlet-addon-workmgr-568757458-778zv_open-cluster-management-agent-addon(0fe4c9de-e474-4b63-bc5b-da2d6d40ba00)\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" podUID="0fe4c9de-e474-4b63-bc5b-da2d6d40ba00" Apr 20 15:07:15.198914 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:15.198881 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p98mt" Apr 20 15:07:15.531645 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:15.531618 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" Apr 20 15:07:15.531954 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:15.531941 2572 scope.go:117] "RemoveContainer" containerID="c77b9b29b96a94db84ef7df69a2ef6963463723f228630e4b104d74ac275e8ca" Apr 20 15:07:15.532127 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:07:15.532110 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=addon-agent pod=managed-serviceaccount-addon-agent-67f887d797-rmz8d_open-cluster-management-agent-addon(2545e8e2-aae3-49b8-80c8-71db98b8f417)\"" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" podUID="2545e8e2-aae3-49b8-80c8-71db98b8f417" Apr 20 15:07:15.541377 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:15.541359 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" Apr 20 15:07:15.541688 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:15.541677 2572 scope.go:117] "RemoveContainer" containerID="1d09a46b556136dcd2ecd66d309360bf4fcc7b55f91a4f338910c757117b2cd8" Apr 20 15:07:15.541840 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:07:15.541823 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=acm-agent pod=klusterlet-addon-workmgr-568757458-778zv_open-cluster-management-agent-addon(0fe4c9de-e474-4b63-bc5b-da2d6d40ba00)\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" podUID="0fe4c9de-e474-4b63-bc5b-da2d6d40ba00" Apr 20 15:07:18.683556 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:18.683512 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f46db577b-xgcwj"] Apr 20 
15:07:18.687629 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:18.687588 2572 patch_prober.go:28] interesting pod/image-registry-5f46db577b-xgcwj container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:07:18.687755 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:18.687646 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" podUID="f915f3e5-4964-46bc-9adb-e34434ecea10" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:07:27.406198 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:27.406165 2572 scope.go:117] "RemoveContainer" containerID="1d09a46b556136dcd2ecd66d309360bf4fcc7b55f91a4f338910c757117b2cd8" Apr 20 15:07:27.411277 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:27.406972 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:07:28.256575 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:28.256538 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" event={"ID":"0fe4c9de-e474-4b63-bc5b-da2d6d40ba00","Type":"ContainerStarted","Data":"4b4363a7e502e9eeb198815fe8b892ea592c9bfcc5ef56e16d9239bf5ade0a22"} Apr 20 15:07:28.256831 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:28.256809 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" Apr 20 15:07:28.257980 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:28.257964 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-568757458-778zv" 
Apr 20 15:07:28.405805 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:28.405779 2572 scope.go:117] "RemoveContainer" containerID="c77b9b29b96a94db84ef7df69a2ef6963463723f228630e4b104d74ac275e8ca" Apr 20 15:07:28.687413 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:28.687337 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:07:29.260584 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:29.260547 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67f887d797-rmz8d" event={"ID":"2545e8e2-aae3-49b8-80c8-71db98b8f417","Type":"ContainerStarted","Data":"2962998b05d03373017c756af95ee9c64035845156cf5d7e9a19dcc662930498"} Apr 20 15:07:43.701945 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:43.701882 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" podUID="f915f3e5-4964-46bc-9adb-e34434ecea10" containerName="registry" containerID="cri-o://22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320" gracePeriod=30 Apr 20 15:07:44.930548 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:44.930526 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:07:45.029040 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.028951 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f915f3e5-4964-46bc-9adb-e34434ecea10-ca-trust-extracted\") pod \"f915f3e5-4964-46bc-9adb-e34434ecea10\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " Apr 20 15:07:45.029040 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.028990 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkn57\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-kube-api-access-zkn57\") pod \"f915f3e5-4964-46bc-9adb-e34434ecea10\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " Apr 20 15:07:45.029040 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.029020 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-installation-pull-secrets\") pod \"f915f3e5-4964-46bc-9adb-e34434ecea10\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " Apr 20 15:07:45.029040 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.029038 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-image-registry-private-configuration\") pod \"f915f3e5-4964-46bc-9adb-e34434ecea10\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " Apr 20 15:07:45.029365 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.029055 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-trusted-ca\") pod \"f915f3e5-4964-46bc-9adb-e34434ecea10\" (UID: 
\"f915f3e5-4964-46bc-9adb-e34434ecea10\") " Apr 20 15:07:45.029365 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.029075 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") pod \"f915f3e5-4964-46bc-9adb-e34434ecea10\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " Apr 20 15:07:45.029365 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.029097 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-certificates\") pod \"f915f3e5-4964-46bc-9adb-e34434ecea10\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " Apr 20 15:07:45.029545 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.029453 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-bound-sa-token\") pod \"f915f3e5-4964-46bc-9adb-e34434ecea10\" (UID: \"f915f3e5-4964-46bc-9adb-e34434ecea10\") " Apr 20 15:07:45.029641 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.029615 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f915f3e5-4964-46bc-9adb-e34434ecea10" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:07:45.029703 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.029640 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f915f3e5-4964-46bc-9adb-e34434ecea10" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:07:45.029759 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.029742 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-trusted-ca\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:07:45.029813 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.029761 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-certificates\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:07:45.031500 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.031425 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f915f3e5-4964-46bc-9adb-e34434ecea10" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:07:45.031771 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.031733 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-kube-api-access-zkn57" (OuterVolumeSpecName: "kube-api-access-zkn57") pod "f915f3e5-4964-46bc-9adb-e34434ecea10" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10"). 
InnerVolumeSpecName "kube-api-access-zkn57". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:07:45.031771 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.031746 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f915f3e5-4964-46bc-9adb-e34434ecea10" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:07:45.031940 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.031921 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f915f3e5-4964-46bc-9adb-e34434ecea10" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:07:45.032118 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.032102 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f915f3e5-4964-46bc-9adb-e34434ecea10" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:07:45.039955 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.039930 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f915f3e5-4964-46bc-9adb-e34434ecea10-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f915f3e5-4964-46bc-9adb-e34434ecea10" (UID: "f915f3e5-4964-46bc-9adb-e34434ecea10"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:07:45.130713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.130687 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-installation-pull-secrets\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:07:45.130713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.130713 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f915f3e5-4964-46bc-9adb-e34434ecea10-image-registry-private-configuration\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:07:45.130900 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.130725 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-registry-tls\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:07:45.130900 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.130734 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-bound-sa-token\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:07:45.130900 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.130742 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f915f3e5-4964-46bc-9adb-e34434ecea10-ca-trust-extracted\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:07:45.130900 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.130751 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zkn57\" (UniqueName: \"kubernetes.io/projected/f915f3e5-4964-46bc-9adb-e34434ecea10-kube-api-access-zkn57\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath 
\"\"" Apr 20 15:07:45.303533 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.303434 2572 generic.go:358] "Generic (PLEG): container finished" podID="f915f3e5-4964-46bc-9adb-e34434ecea10" containerID="22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320" exitCode=0 Apr 20 15:07:45.303533 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.303506 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" Apr 20 15:07:45.303714 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.303530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" event={"ID":"f915f3e5-4964-46bc-9adb-e34434ecea10","Type":"ContainerDied","Data":"22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320"} Apr 20 15:07:45.303714 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.303565 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f46db577b-xgcwj" event={"ID":"f915f3e5-4964-46bc-9adb-e34434ecea10","Type":"ContainerDied","Data":"162c3967496629c0135b5dd20a29948e016c75f2e8f54c536c612421bf5c0466"} Apr 20 15:07:45.303714 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.303580 2572 scope.go:117] "RemoveContainer" containerID="22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320" Apr 20 15:07:45.311467 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.311447 2572 scope.go:117] "RemoveContainer" containerID="22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320" Apr 20 15:07:45.311770 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:07:45.311750 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320\": container with ID starting with 22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320 not found: ID does not 
exist" containerID="22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320" Apr 20 15:07:45.311847 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.311776 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320"} err="failed to get container status \"22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320\": rpc error: code = NotFound desc = could not find container \"22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320\": container with ID starting with 22172bda47dbd96e14d6a7b6addf62b0e570587d2bf623f5d9e72c0c05e51320 not found: ID does not exist" Apr 20 15:07:45.324199 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.324178 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f46db577b-xgcwj"] Apr 20 15:07:45.329156 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:45.329134 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5f46db577b-xgcwj"] Apr 20 15:07:46.409920 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:07:46.409886 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f915f3e5-4964-46bc-9adb-e34434ecea10" path="/var/lib/kubelet/pods/f915f3e5-4964-46bc-9adb-e34434ecea10/volumes" Apr 20 15:09:02.687586 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.687548 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp"] Apr 20 15:09:02.688052 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.687826 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f915f3e5-4964-46bc-9adb-e34434ecea10" containerName="registry" Apr 20 15:09:02.688052 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.687837 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f915f3e5-4964-46bc-9adb-e34434ecea10" containerName="registry" Apr 20 15:09:02.688052 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.687891 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f915f3e5-4964-46bc-9adb-e34434ecea10" containerName="registry" Apr 20 15:09:02.689770 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.689753 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:02.692319 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.692298 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 15:09:02.693363 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.693345 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 15:09:02.693363 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.693354 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5dxmm\"" Apr 20 15:09:02.698572 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.698550 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp"] Apr 20 15:09:02.737696 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.737668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:02.737879 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.737706 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pnff\" (UniqueName: \"kubernetes.io/projected/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-kube-api-access-7pnff\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:02.737879 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.737803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:02.839142 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.839103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:02.839332 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.839156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:02.839332 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.839178 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7pnff\" (UniqueName: \"kubernetes.io/projected/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-kube-api-access-7pnff\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:02.839522 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.839462 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:02.839600 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.839564 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:02.847473 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.847453 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pnff\" (UniqueName: \"kubernetes.io/projected/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-kube-api-access-7pnff\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:02.998732 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:02.998705 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:03.109141 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:03.109112 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp"] Apr 20 15:09:03.112098 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:09:03.112065 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0cfc50e_6127_4cd0_b03e_4d9327738ab0.slice/crio-2bffd3a6af18d0d480ec25e6bea1104c0c58826e711a422e91693e7f3504a9c6 WatchSource:0}: Error finding container 2bffd3a6af18d0d480ec25e6bea1104c0c58826e711a422e91693e7f3504a9c6: Status 404 returned error can't find the container with id 2bffd3a6af18d0d480ec25e6bea1104c0c58826e711a422e91693e7f3504a9c6 Apr 20 15:09:03.495443 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:03.495412 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" event={"ID":"f0cfc50e-6127-4cd0-b03e-4d9327738ab0","Type":"ContainerStarted","Data":"2bffd3a6af18d0d480ec25e6bea1104c0c58826e711a422e91693e7f3504a9c6"} Apr 20 15:09:11.517584 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:11.517552 2572 generic.go:358] "Generic (PLEG): container finished" podID="f0cfc50e-6127-4cd0-b03e-4d9327738ab0" containerID="fc20be775f3223c0322420fbd897283999f9a0383e9b3cccc8dc6d84a25ce01c" exitCode=0 Apr 20 15:09:11.517584 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:11.517589 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" event={"ID":"f0cfc50e-6127-4cd0-b03e-4d9327738ab0","Type":"ContainerDied","Data":"fc20be775f3223c0322420fbd897283999f9a0383e9b3cccc8dc6d84a25ce01c"} Apr 20 15:09:13.524983 ip-10-0-134-230 kubenswrapper[2572]: 
I0420 15:09:13.524955 2572 generic.go:358] "Generic (PLEG): container finished" podID="f0cfc50e-6127-4cd0-b03e-4d9327738ab0" containerID="5f0ec83678158c9a9d2cee3de022958d47f4f5293efcc35321be5831a537dccf" exitCode=0 Apr 20 15:09:13.525293 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:13.525047 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" event={"ID":"f0cfc50e-6127-4cd0-b03e-4d9327738ab0","Type":"ContainerDied","Data":"5f0ec83678158c9a9d2cee3de022958d47f4f5293efcc35321be5831a537dccf"} Apr 20 15:09:20.545459 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:20.545428 2572 generic.go:358] "Generic (PLEG): container finished" podID="f0cfc50e-6127-4cd0-b03e-4d9327738ab0" containerID="e8fcd143fd4da1d6d970c6d455054acae31fc21f1a0a150e500469ecb4f36f9b" exitCode=0 Apr 20 15:09:20.545843 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:20.545503 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" event={"ID":"f0cfc50e-6127-4cd0-b03e-4d9327738ab0","Type":"ContainerDied","Data":"e8fcd143fd4da1d6d970c6d455054acae31fc21f1a0a150e500469ecb4f36f9b"} Apr 20 15:09:21.662477 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:21.662454 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:21.779235 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:21.779198 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-util\") pod \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " Apr 20 15:09:21.779235 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:21.779244 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-bundle\") pod \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " Apr 20 15:09:21.779396 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:21.779268 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pnff\" (UniqueName: \"kubernetes.io/projected/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-kube-api-access-7pnff\") pod \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\" (UID: \"f0cfc50e-6127-4cd0-b03e-4d9327738ab0\") " Apr 20 15:09:21.779827 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:21.779795 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-bundle" (OuterVolumeSpecName: "bundle") pod "f0cfc50e-6127-4cd0-b03e-4d9327738ab0" (UID: "f0cfc50e-6127-4cd0-b03e-4d9327738ab0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:09:21.781339 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:21.781314 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-kube-api-access-7pnff" (OuterVolumeSpecName: "kube-api-access-7pnff") pod "f0cfc50e-6127-4cd0-b03e-4d9327738ab0" (UID: "f0cfc50e-6127-4cd0-b03e-4d9327738ab0"). InnerVolumeSpecName "kube-api-access-7pnff". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:09:21.783311 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:21.783284 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-util" (OuterVolumeSpecName: "util") pod "f0cfc50e-6127-4cd0-b03e-4d9327738ab0" (UID: "f0cfc50e-6127-4cd0-b03e-4d9327738ab0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:09:21.880226 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:21.880159 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-util\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:09:21.880226 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:21.880181 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-bundle\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:09:21.880226 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:21.880192 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7pnff\" (UniqueName: \"kubernetes.io/projected/f0cfc50e-6127-4cd0-b03e-4d9327738ab0-kube-api-access-7pnff\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:09:22.552918 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:22.552892 2572 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" Apr 20 15:09:22.553068 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:22.552882 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mt7tp" event={"ID":"f0cfc50e-6127-4cd0-b03e-4d9327738ab0","Type":"ContainerDied","Data":"2bffd3a6af18d0d480ec25e6bea1104c0c58826e711a422e91693e7f3504a9c6"} Apr 20 15:09:22.553068 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:22.552994 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bffd3a6af18d0d480ec25e6bea1104c0c58826e711a422e91693e7f3504a9c6" Apr 20 15:09:29.887367 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.887327 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr"] Apr 20 15:09:29.887869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.887583 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0cfc50e-6127-4cd0-b03e-4d9327738ab0" containerName="util" Apr 20 15:09:29.887869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.887594 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0cfc50e-6127-4cd0-b03e-4d9327738ab0" containerName="util" Apr 20 15:09:29.887869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.887601 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0cfc50e-6127-4cd0-b03e-4d9327738ab0" containerName="pull" Apr 20 15:09:29.887869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.887607 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0cfc50e-6127-4cd0-b03e-4d9327738ab0" containerName="pull" Apr 20 15:09:29.887869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.887613 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f0cfc50e-6127-4cd0-b03e-4d9327738ab0" containerName="extract" Apr 20 15:09:29.887869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.887619 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0cfc50e-6127-4cd0-b03e-4d9327738ab0" containerName="extract" Apr 20 15:09:29.887869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.887667 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0cfc50e-6127-4cd0-b03e-4d9327738ab0" containerName="extract" Apr 20 15:09:29.892309 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.892291 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:29.895699 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.895681 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 15:09:29.895780 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.895683 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 15:09:29.896591 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.896575 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5dxmm\"" Apr 20 15:09:29.901480 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:29.901457 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr"] Apr 20 15:09:30.036181 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.036138 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr\" (UID: 
\"9e6d3607-6407-415e-ab25-35fd50699afa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:30.036181 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.036185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr\" (UID: \"9e6d3607-6407-415e-ab25-35fd50699afa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:30.036397 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.036216 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f4cx\" (UniqueName: \"kubernetes.io/projected/9e6d3607-6407-415e-ab25-35fd50699afa-kube-api-access-9f4cx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr\" (UID: \"9e6d3607-6407-415e-ab25-35fd50699afa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:30.137138 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.137103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr\" (UID: \"9e6d3607-6407-415e-ab25-35fd50699afa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:30.137138 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.137140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr\" (UID: 
\"9e6d3607-6407-415e-ab25-35fd50699afa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:30.137355 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.137170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9f4cx\" (UniqueName: \"kubernetes.io/projected/9e6d3607-6407-415e-ab25-35fd50699afa-kube-api-access-9f4cx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr\" (UID: \"9e6d3607-6407-415e-ab25-35fd50699afa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:30.137582 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.137529 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr\" (UID: \"9e6d3607-6407-415e-ab25-35fd50699afa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:30.137582 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.137530 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr\" (UID: \"9e6d3607-6407-415e-ab25-35fd50699afa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:30.145352 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.145327 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f4cx\" (UniqueName: \"kubernetes.io/projected/9e6d3607-6407-415e-ab25-35fd50699afa-kube-api-access-9f4cx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr\" (UID: \"9e6d3607-6407-415e-ab25-35fd50699afa\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:30.201169 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.201146 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:30.310502 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.310461 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr"] Apr 20 15:09:30.314760 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:09:30.314735 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e6d3607_6407_415e_ab25_35fd50699afa.slice/crio-1fbc61fb79cd172dffe68b103de56481eddd42667c1bdef8e200b43b18f06166 WatchSource:0}: Error finding container 1fbc61fb79cd172dffe68b103de56481eddd42667c1bdef8e200b43b18f06166: Status 404 returned error can't find the container with id 1fbc61fb79cd172dffe68b103de56481eddd42667c1bdef8e200b43b18f06166 Apr 20 15:09:30.576435 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.576399 2572 generic.go:358] "Generic (PLEG): container finished" podID="9e6d3607-6407-415e-ab25-35fd50699afa" containerID="596e43261d36580891011ac46263d74c615c9dae8c0d6647ddccc5dbb067395a" exitCode=0 Apr 20 15:09:30.576620 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.576450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" event={"ID":"9e6d3607-6407-415e-ab25-35fd50699afa","Type":"ContainerDied","Data":"596e43261d36580891011ac46263d74c615c9dae8c0d6647ddccc5dbb067395a"} Apr 20 15:09:30.576620 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:30.576473 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" event={"ID":"9e6d3607-6407-415e-ab25-35fd50699afa","Type":"ContainerStarted","Data":"1fbc61fb79cd172dffe68b103de56481eddd42667c1bdef8e200b43b18f06166"} Apr 20 15:09:32.583339 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:32.583305 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" event={"ID":"9e6d3607-6407-415e-ab25-35fd50699afa","Type":"ContainerStarted","Data":"253d962c0b399bc4922d5ccec4f8514c75b5e93723b4ceb102191a115131f09e"} Apr 20 15:09:33.587792 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:33.587752 2572 generic.go:358] "Generic (PLEG): container finished" podID="9e6d3607-6407-415e-ab25-35fd50699afa" containerID="253d962c0b399bc4922d5ccec4f8514c75b5e93723b4ceb102191a115131f09e" exitCode=0 Apr 20 15:09:33.588205 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:33.587807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" event={"ID":"9e6d3607-6407-415e-ab25-35fd50699afa","Type":"ContainerDied","Data":"253d962c0b399bc4922d5ccec4f8514c75b5e93723b4ceb102191a115131f09e"} Apr 20 15:09:34.591659 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:34.591619 2572 generic.go:358] "Generic (PLEG): container finished" podID="9e6d3607-6407-415e-ab25-35fd50699afa" containerID="90f816dcea19c5077afe03961e7c8bfef35f85b474c494dfa7c1c188e4c7c54e" exitCode=0 Apr 20 15:09:34.592039 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:34.591699 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" event={"ID":"9e6d3607-6407-415e-ab25-35fd50699afa","Type":"ContainerDied","Data":"90f816dcea19c5077afe03961e7c8bfef35f85b474c494dfa7c1c188e4c7c54e"} Apr 20 15:09:35.709822 ip-10-0-134-230 kubenswrapper[2572]: I0420 
15:09:35.709800 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:35.881829 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:35.881740 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-bundle\") pod \"9e6d3607-6407-415e-ab25-35fd50699afa\" (UID: \"9e6d3607-6407-415e-ab25-35fd50699afa\") " Apr 20 15:09:35.881829 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:35.881799 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f4cx\" (UniqueName: \"kubernetes.io/projected/9e6d3607-6407-415e-ab25-35fd50699afa-kube-api-access-9f4cx\") pod \"9e6d3607-6407-415e-ab25-35fd50699afa\" (UID: \"9e6d3607-6407-415e-ab25-35fd50699afa\") " Apr 20 15:09:35.882067 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:35.881831 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-util\") pod \"9e6d3607-6407-415e-ab25-35fd50699afa\" (UID: \"9e6d3607-6407-415e-ab25-35fd50699afa\") " Apr 20 15:09:35.882137 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:35.882114 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-bundle" (OuterVolumeSpecName: "bundle") pod "9e6d3607-6407-415e-ab25-35fd50699afa" (UID: "9e6d3607-6407-415e-ab25-35fd50699afa"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:09:35.883960 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:35.883935 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e6d3607-6407-415e-ab25-35fd50699afa-kube-api-access-9f4cx" (OuterVolumeSpecName: "kube-api-access-9f4cx") pod "9e6d3607-6407-415e-ab25-35fd50699afa" (UID: "9e6d3607-6407-415e-ab25-35fd50699afa"). InnerVolumeSpecName "kube-api-access-9f4cx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:09:35.886281 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:35.886257 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-util" (OuterVolumeSpecName: "util") pod "9e6d3607-6407-415e-ab25-35fd50699afa" (UID: "9e6d3607-6407-415e-ab25-35fd50699afa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:09:35.982247 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:35.982211 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9f4cx\" (UniqueName: \"kubernetes.io/projected/9e6d3607-6407-415e-ab25-35fd50699afa-kube-api-access-9f4cx\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:09:35.982247 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:35.982242 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-util\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:09:35.982247 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:35.982253 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e6d3607-6407-415e-ab25-35fd50699afa-bundle\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:09:36.598640 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:36.598604 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" event={"ID":"9e6d3607-6407-415e-ab25-35fd50699afa","Type":"ContainerDied","Data":"1fbc61fb79cd172dffe68b103de56481eddd42667c1bdef8e200b43b18f06166"} Apr 20 15:09:36.598640 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:36.598641 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fbc61fb79cd172dffe68b103de56481eddd42667c1bdef8e200b43b18f06166" Apr 20 15:09:36.598833 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:36.598645 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhrpjr" Apr 20 15:09:40.779731 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.779651 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg"] Apr 20 15:09:40.780161 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.779866 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e6d3607-6407-415e-ab25-35fd50699afa" containerName="pull" Apr 20 15:09:40.780161 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.779876 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6d3607-6407-415e-ab25-35fd50699afa" containerName="pull" Apr 20 15:09:40.780161 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.779891 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e6d3607-6407-415e-ab25-35fd50699afa" containerName="extract" Apr 20 15:09:40.780161 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.779897 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6d3607-6407-415e-ab25-35fd50699afa" containerName="extract" Apr 20 15:09:40.780161 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.779905 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="9e6d3607-6407-415e-ab25-35fd50699afa" containerName="util" Apr 20 15:09:40.780161 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.779910 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6d3607-6407-415e-ab25-35fd50699afa" containerName="util" Apr 20 15:09:40.780161 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.779948 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e6d3607-6407-415e-ab25-35fd50699afa" containerName="extract" Apr 20 15:09:40.781534 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.781517 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg" Apr 20 15:09:40.783848 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.783820 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 15:09:40.784761 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.784741 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:09:40.784761 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.784753 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-jnq5g\"" Apr 20 15:09:40.789730 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.789708 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg"] Apr 20 15:09:40.916199 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.916166 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6sxt\" (UniqueName: \"kubernetes.io/projected/c2a3d95d-2318-4e9e-b741-a5263163d18e-kube-api-access-t6sxt\") pod \"openshift-lws-operator-bfc7f696d-qkldg\" (UID: \"c2a3d95d-2318-4e9e-b741-a5263163d18e\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg" Apr 20 15:09:40.916358 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:40.916223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2a3d95d-2318-4e9e-b741-a5263163d18e-tmp\") pod \"openshift-lws-operator-bfc7f696d-qkldg\" (UID: \"c2a3d95d-2318-4e9e-b741-a5263163d18e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg" Apr 20 15:09:41.016962 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:41.016925 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6sxt\" (UniqueName: \"kubernetes.io/projected/c2a3d95d-2318-4e9e-b741-a5263163d18e-kube-api-access-t6sxt\") pod \"openshift-lws-operator-bfc7f696d-qkldg\" (UID: \"c2a3d95d-2318-4e9e-b741-a5263163d18e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg" Apr 20 15:09:41.017097 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:41.016984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2a3d95d-2318-4e9e-b741-a5263163d18e-tmp\") pod \"openshift-lws-operator-bfc7f696d-qkldg\" (UID: \"c2a3d95d-2318-4e9e-b741-a5263163d18e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg" Apr 20 15:09:41.017321 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:41.017307 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2a3d95d-2318-4e9e-b741-a5263163d18e-tmp\") pod \"openshift-lws-operator-bfc7f696d-qkldg\" (UID: \"c2a3d95d-2318-4e9e-b741-a5263163d18e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg" Apr 20 15:09:41.024995 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:41.024967 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6sxt\" (UniqueName: 
\"kubernetes.io/projected/c2a3d95d-2318-4e9e-b741-a5263163d18e-kube-api-access-t6sxt\") pod \"openshift-lws-operator-bfc7f696d-qkldg\" (UID: \"c2a3d95d-2318-4e9e-b741-a5263163d18e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg" Apr 20 15:09:41.090002 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:41.089929 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg" Apr 20 15:09:41.202217 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:41.202188 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg"] Apr 20 15:09:41.205007 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:09:41.204981 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2a3d95d_2318_4e9e_b741_a5263163d18e.slice/crio-0af7901d5cba392c0e3ca12e2e851bd98b2f8ba32f3cb1ba97f880acabdef86d WatchSource:0}: Error finding container 0af7901d5cba392c0e3ca12e2e851bd98b2f8ba32f3cb1ba97f880acabdef86d: Status 404 returned error can't find the container with id 0af7901d5cba392c0e3ca12e2e851bd98b2f8ba32f3cb1ba97f880acabdef86d Apr 20 15:09:41.611885 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:41.611851 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg" event={"ID":"c2a3d95d-2318-4e9e-b741-a5263163d18e","Type":"ContainerStarted","Data":"0af7901d5cba392c0e3ca12e2e851bd98b2f8ba32f3cb1ba97f880acabdef86d"} Apr 20 15:09:43.618636 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:43.618597 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg" event={"ID":"c2a3d95d-2318-4e9e-b741-a5263163d18e","Type":"ContainerStarted","Data":"a329a3ab2e39331c00bfd8b13bc838b4bb9044dc1b2092a7169a6bd1436ffe88"} Apr 20 15:09:43.633446 ip-10-0-134-230 
kubenswrapper[2572]: I0420 15:09:43.633397 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qkldg" podStartSLOduration=1.869413301 podStartE2EDuration="3.633384753s" podCreationTimestamp="2026-04-20 15:09:40 +0000 UTC" firstStartedPulling="2026-04-20 15:09:41.206500744 +0000 UTC m=+441.337376041" lastFinishedPulling="2026-04-20 15:09:42.970472211 +0000 UTC m=+443.101347493" observedRunningTime="2026-04-20 15:09:43.632804222 +0000 UTC m=+443.763679525" watchObservedRunningTime="2026-04-20 15:09:43.633384753 +0000 UTC m=+443.764260056" Apr 20 15:09:46.549867 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.549832 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj"] Apr 20 15:09:46.552067 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.552052 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:46.554348 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.554323 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 15:09:46.554503 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.554400 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 15:09:46.555357 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.555339 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5dxmm\"" Apr 20 15:09:46.560479 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.560443 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj"] Apr 20 15:09:46.659445 
ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.659415 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:46.659584 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.659456 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gltwx\" (UniqueName: \"kubernetes.io/projected/8c6e90c5-b30f-43cb-bf52-d7d16b587979-kube-api-access-gltwx\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:46.659584 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.659519 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:46.760594 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.760567 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:46.760681 ip-10-0-134-230 
kubenswrapper[2572]: I0420 15:09:46.760605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gltwx\" (UniqueName: \"kubernetes.io/projected/8c6e90c5-b30f-43cb-bf52-d7d16b587979-kube-api-access-gltwx\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:46.760792 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.760778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:46.760919 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.760900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:46.761047 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.761033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:46.768477 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.768458 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gltwx\" (UniqueName: \"kubernetes.io/projected/8c6e90c5-b30f-43cb-bf52-d7d16b587979-kube-api-access-gltwx\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:46.861460 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.861407 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:46.987325 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:46.987295 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj"] Apr 20 15:09:46.990314 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:09:46.990290 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c6e90c5_b30f_43cb_bf52_d7d16b587979.slice/crio-972f37f962f7d58d319c8bb5311025b955f1d2f0ebe1bc0c75286af7c7ded050 WatchSource:0}: Error finding container 972f37f962f7d58d319c8bb5311025b955f1d2f0ebe1bc0c75286af7c7ded050: Status 404 returned error can't find the container with id 972f37f962f7d58d319c8bb5311025b955f1d2f0ebe1bc0c75286af7c7ded050 Apr 20 15:09:47.630177 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:47.630144 2572 generic.go:358] "Generic (PLEG): container finished" podID="8c6e90c5-b30f-43cb-bf52-d7d16b587979" containerID="df5d5a054d6653c4d1b2f061a8ce44f6068c10bf4004b5bec73c61b267ce1b64" exitCode=0 Apr 20 15:09:47.630591 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:47.630187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" 
event={"ID":"8c6e90c5-b30f-43cb-bf52-d7d16b587979","Type":"ContainerDied","Data":"df5d5a054d6653c4d1b2f061a8ce44f6068c10bf4004b5bec73c61b267ce1b64"} Apr 20 15:09:47.630591 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:47.630207 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" event={"ID":"8c6e90c5-b30f-43cb-bf52-d7d16b587979","Type":"ContainerStarted","Data":"972f37f962f7d58d319c8bb5311025b955f1d2f0ebe1bc0c75286af7c7ded050"} Apr 20 15:09:48.634133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:48.634101 2572 generic.go:358] "Generic (PLEG): container finished" podID="8c6e90c5-b30f-43cb-bf52-d7d16b587979" containerID="c95cf1d0dbf36d73202cfe9d880edb87b22a8b75f9833b77315c25ea32e83f59" exitCode=0 Apr 20 15:09:48.634592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:48.634184 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" event={"ID":"8c6e90c5-b30f-43cb-bf52-d7d16b587979","Type":"ContainerDied","Data":"c95cf1d0dbf36d73202cfe9d880edb87b22a8b75f9833b77315c25ea32e83f59"} Apr 20 15:09:49.638697 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:49.638665 2572 generic.go:358] "Generic (PLEG): container finished" podID="8c6e90c5-b30f-43cb-bf52-d7d16b587979" containerID="360e52618b51e35598d02cc6bcce40415a7e4e686207ef82200a24ff3d58ba83" exitCode=0 Apr 20 15:09:49.639050 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:49.638753 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" event={"ID":"8c6e90c5-b30f-43cb-bf52-d7d16b587979","Type":"ContainerDied","Data":"360e52618b51e35598d02cc6bcce40415a7e4e686207ef82200a24ff3d58ba83"} Apr 20 15:09:50.754316 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:50.754295 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:09:50.886144 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:50.886110 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-util\") pod \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " Apr 20 15:09:50.886144 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:50.886152 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-bundle\") pod \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " Apr 20 15:09:50.886381 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:50.886177 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gltwx\" (UniqueName: \"kubernetes.io/projected/8c6e90c5-b30f-43cb-bf52-d7d16b587979-kube-api-access-gltwx\") pod \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\" (UID: \"8c6e90c5-b30f-43cb-bf52-d7d16b587979\") " Apr 20 15:09:50.886869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:50.886842 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-bundle" (OuterVolumeSpecName: "bundle") pod "8c6e90c5-b30f-43cb-bf52-d7d16b587979" (UID: "8c6e90c5-b30f-43cb-bf52-d7d16b587979"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:09:50.888257 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:50.888237 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6e90c5-b30f-43cb-bf52-d7d16b587979-kube-api-access-gltwx" (OuterVolumeSpecName: "kube-api-access-gltwx") pod "8c6e90c5-b30f-43cb-bf52-d7d16b587979" (UID: "8c6e90c5-b30f-43cb-bf52-d7d16b587979"). InnerVolumeSpecName "kube-api-access-gltwx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:09:50.894404 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:50.894367 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-util" (OuterVolumeSpecName: "util") pod "8c6e90c5-b30f-43cb-bf52-d7d16b587979" (UID: "8c6e90c5-b30f-43cb-bf52-d7d16b587979"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:09:50.987165 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:50.987135 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-util\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:09:50.987165 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:50.987162 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c6e90c5-b30f-43cb-bf52-d7d16b587979-bundle\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:09:50.987165 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:50.987171 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gltwx\" (UniqueName: \"kubernetes.io/projected/8c6e90c5-b30f-43cb-bf52-d7d16b587979-kube-api-access-gltwx\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:09:51.645094 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:51.645061 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" event={"ID":"8c6e90c5-b30f-43cb-bf52-d7d16b587979","Type":"ContainerDied","Data":"972f37f962f7d58d319c8bb5311025b955f1d2f0ebe1bc0c75286af7c7ded050"} Apr 20 15:09:51.645094 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:51.645094 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="972f37f962f7d58d319c8bb5311025b955f1d2f0ebe1bc0c75286af7c7ded050" Apr 20 15:09:51.645298 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:09:51.645137 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5z2nlj" Apr 20 15:10:03.296340 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.296303 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7"] Apr 20 15:10:03.296767 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.296565 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c6e90c5-b30f-43cb-bf52-d7d16b587979" containerName="pull" Apr 20 15:10:03.296767 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.296578 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6e90c5-b30f-43cb-bf52-d7d16b587979" containerName="pull" Apr 20 15:10:03.296767 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.296587 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c6e90c5-b30f-43cb-bf52-d7d16b587979" containerName="extract" Apr 20 15:10:03.296767 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.296593 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6e90c5-b30f-43cb-bf52-d7d16b587979" containerName="extract" Apr 20 15:10:03.296767 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.296605 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="8c6e90c5-b30f-43cb-bf52-d7d16b587979" containerName="util" Apr 20 15:10:03.296767 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.296611 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6e90c5-b30f-43cb-bf52-d7d16b587979" containerName="util" Apr 20 15:10:03.296767 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.296652 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c6e90c5-b30f-43cb-bf52-d7d16b587979" containerName="extract" Apr 20 15:10:03.301100 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.301081 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:03.304016 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.303995 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 15:10:03.304518 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.304502 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5dxmm\"" Apr 20 15:10:03.309655 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.309635 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 15:10:03.316465 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.316442 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7"] Apr 20 15:10:03.373140 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.373099 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7\" (UID: 
\"ac315570-366d-4412-aca4-c6137390ccd8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:03.373140 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.373140 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7\" (UID: \"ac315570-366d-4412-aca4-c6137390ccd8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:03.373368 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.373177 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drprq\" (UniqueName: \"kubernetes.io/projected/ac315570-366d-4412-aca4-c6137390ccd8-kube-api-access-drprq\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7\" (UID: \"ac315570-366d-4412-aca4-c6137390ccd8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:03.474594 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.474556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7\" (UID: \"ac315570-366d-4412-aca4-c6137390ccd8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:03.474594 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.474596 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7\" (UID: 
\"ac315570-366d-4412-aca4-c6137390ccd8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:03.474799 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.474642 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drprq\" (UniqueName: \"kubernetes.io/projected/ac315570-366d-4412-aca4-c6137390ccd8-kube-api-access-drprq\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7\" (UID: \"ac315570-366d-4412-aca4-c6137390ccd8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:03.474978 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.474959 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7\" (UID: \"ac315570-366d-4412-aca4-c6137390ccd8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:03.475030 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.475014 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7\" (UID: \"ac315570-366d-4412-aca4-c6137390ccd8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:03.482770 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.482748 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drprq\" (UniqueName: \"kubernetes.io/projected/ac315570-366d-4412-aca4-c6137390ccd8-kube-api-access-drprq\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7\" (UID: \"ac315570-366d-4412-aca4-c6137390ccd8\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:03.610148 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.610070 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:03.728317 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.728196 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7"] Apr 20 15:10:03.730751 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:10:03.730718 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac315570_366d_4412_aca4_c6137390ccd8.slice/crio-b27dd8fa960cf7926c5e75d56323c098b5b7da5c9cd3dede53533a18fd4a8533 WatchSource:0}: Error finding container b27dd8fa960cf7926c5e75d56323c098b5b7da5c9cd3dede53533a18fd4a8533: Status 404 returned error can't find the container with id b27dd8fa960cf7926c5e75d56323c098b5b7da5c9cd3dede53533a18fd4a8533 Apr 20 15:10:03.990559 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.990450 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb"] Apr 20 15:10:03.993497 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.993457 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:03.996037 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.996015 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 15:10:03.996168 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.996050 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-td6pf\"" Apr 20 15:10:03.996386 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.996367 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 15:10:03.996476 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.996455 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 15:10:03.996759 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:03.996741 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 15:10:04.013380 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.013356 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb"] Apr 20 15:10:04.080054 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.080017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6be64f0b-1aba-438c-a1bd-ae1c81cd4a97-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-jz9pb\" (UID: \"6be64f0b-1aba-438c-a1bd-ae1c81cd4a97\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:04.080054 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.080061 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpkwm\" (UniqueName: \"kubernetes.io/projected/6be64f0b-1aba-438c-a1bd-ae1c81cd4a97-kube-api-access-gpkwm\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-jz9pb\" (UID: \"6be64f0b-1aba-438c-a1bd-ae1c81cd4a97\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:04.080263 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.080093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6be64f0b-1aba-438c-a1bd-ae1c81cd4a97-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-jz9pb\" (UID: \"6be64f0b-1aba-438c-a1bd-ae1c81cd4a97\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:04.180400 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.180360 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6be64f0b-1aba-438c-a1bd-ae1c81cd4a97-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-jz9pb\" (UID: \"6be64f0b-1aba-438c-a1bd-ae1c81cd4a97\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:04.180626 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.180409 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpkwm\" (UniqueName: \"kubernetes.io/projected/6be64f0b-1aba-438c-a1bd-ae1c81cd4a97-kube-api-access-gpkwm\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-jz9pb\" (UID: \"6be64f0b-1aba-438c-a1bd-ae1c81cd4a97\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:04.180626 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.180452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6be64f0b-1aba-438c-a1bd-ae1c81cd4a97-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-jz9pb\" (UID: \"6be64f0b-1aba-438c-a1bd-ae1c81cd4a97\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:04.182853 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.182830 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6be64f0b-1aba-438c-a1bd-ae1c81cd4a97-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-jz9pb\" (UID: \"6be64f0b-1aba-438c-a1bd-ae1c81cd4a97\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:04.182853 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.182844 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6be64f0b-1aba-438c-a1bd-ae1c81cd4a97-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-jz9pb\" (UID: \"6be64f0b-1aba-438c-a1bd-ae1c81cd4a97\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:04.189606 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.189584 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpkwm\" (UniqueName: \"kubernetes.io/projected/6be64f0b-1aba-438c-a1bd-ae1c81cd4a97-kube-api-access-gpkwm\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-jz9pb\" (UID: \"6be64f0b-1aba-438c-a1bd-ae1c81cd4a97\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:04.303011 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.302972 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:04.443240 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.443192 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb"] Apr 20 15:10:04.447300 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:10:04.447263 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6be64f0b_1aba_438c_a1bd_ae1c81cd4a97.slice/crio-aa7eb9e92eebd69e1ffad3a645a97e76da1e6042408b8a56e7d452857f1efde3 WatchSource:0}: Error finding container aa7eb9e92eebd69e1ffad3a645a97e76da1e6042408b8a56e7d452857f1efde3: Status 404 returned error can't find the container with id aa7eb9e92eebd69e1ffad3a645a97e76da1e6042408b8a56e7d452857f1efde3 Apr 20 15:10:04.683976 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.683883 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" event={"ID":"6be64f0b-1aba-438c-a1bd-ae1c81cd4a97","Type":"ContainerStarted","Data":"aa7eb9e92eebd69e1ffad3a645a97e76da1e6042408b8a56e7d452857f1efde3"} Apr 20 15:10:04.685156 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.685128 2572 generic.go:358] "Generic (PLEG): container finished" podID="ac315570-366d-4412-aca4-c6137390ccd8" containerID="13f103421f35fdd72b4afd7e5d0aba152561f2e57c1c9d0216aed0e619c57f20" exitCode=0 Apr 20 15:10:04.685273 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.685163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" event={"ID":"ac315570-366d-4412-aca4-c6137390ccd8","Type":"ContainerDied","Data":"13f103421f35fdd72b4afd7e5d0aba152561f2e57c1c9d0216aed0e619c57f20"} Apr 20 15:10:04.685273 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:04.685182 2572 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" event={"ID":"ac315570-366d-4412-aca4-c6137390ccd8","Type":"ContainerStarted","Data":"b27dd8fa960cf7926c5e75d56323c098b5b7da5c9cd3dede53533a18fd4a8533"} Apr 20 15:10:05.690158 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:05.690066 2572 generic.go:358] "Generic (PLEG): container finished" podID="ac315570-366d-4412-aca4-c6137390ccd8" containerID="90350bdcd7ed1adf50108b3b11aeb3a434c5812c94c28a2fe9d01986b680d344" exitCode=0 Apr 20 15:10:05.690158 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:05.690135 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" event={"ID":"ac315570-366d-4412-aca4-c6137390ccd8","Type":"ContainerDied","Data":"90350bdcd7ed1adf50108b3b11aeb3a434c5812c94c28a2fe9d01986b680d344"} Apr 20 15:10:07.698395 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:07.698359 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" event={"ID":"6be64f0b-1aba-438c-a1bd-ae1c81cd4a97","Type":"ContainerStarted","Data":"81f4f8d0ec573094384543a77183e98027f289b65914c47d2ff92cd26f200a9b"} Apr 20 15:10:07.698882 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:07.698525 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:07.700263 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:07.700236 2572 generic.go:358] "Generic (PLEG): container finished" podID="ac315570-366d-4412-aca4-c6137390ccd8" containerID="90bfd370c88a3846912fe2b10a4dcb0d3891c6cf8194a9c5ab0f398a6aa7d33f" exitCode=0 Apr 20 15:10:07.700367 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:07.700300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" event={"ID":"ac315570-366d-4412-aca4-c6137390ccd8","Type":"ContainerDied","Data":"90bfd370c88a3846912fe2b10a4dcb0d3891c6cf8194a9c5ab0f398a6aa7d33f"} Apr 20 15:10:07.718713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:07.718640 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" podStartSLOduration=2.156862097 podStartE2EDuration="4.718629176s" podCreationTimestamp="2026-04-20 15:10:03 +0000 UTC" firstStartedPulling="2026-04-20 15:10:04.448940517 +0000 UTC m=+464.579815798" lastFinishedPulling="2026-04-20 15:10:07.010707592 +0000 UTC m=+467.141582877" observedRunningTime="2026-04-20 15:10:07.717148207 +0000 UTC m=+467.848023509" watchObservedRunningTime="2026-04-20 15:10:07.718629176 +0000 UTC m=+467.849504475" Apr 20 15:10:08.822519 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:08.822498 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:08.930249 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:08.930211 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-util\") pod \"ac315570-366d-4412-aca4-c6137390ccd8\" (UID: \"ac315570-366d-4412-aca4-c6137390ccd8\") " Apr 20 15:10:08.930421 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:08.930272 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-bundle\") pod \"ac315570-366d-4412-aca4-c6137390ccd8\" (UID: \"ac315570-366d-4412-aca4-c6137390ccd8\") " Apr 20 15:10:08.930421 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:08.930318 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drprq\" (UniqueName: \"kubernetes.io/projected/ac315570-366d-4412-aca4-c6137390ccd8-kube-api-access-drprq\") pod \"ac315570-366d-4412-aca4-c6137390ccd8\" (UID: \"ac315570-366d-4412-aca4-c6137390ccd8\") " Apr 20 15:10:08.930941 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:08.930912 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-bundle" (OuterVolumeSpecName: "bundle") pod "ac315570-366d-4412-aca4-c6137390ccd8" (UID: "ac315570-366d-4412-aca4-c6137390ccd8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:10:08.932411 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:08.932387 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac315570-366d-4412-aca4-c6137390ccd8-kube-api-access-drprq" (OuterVolumeSpecName: "kube-api-access-drprq") pod "ac315570-366d-4412-aca4-c6137390ccd8" (UID: "ac315570-366d-4412-aca4-c6137390ccd8"). InnerVolumeSpecName "kube-api-access-drprq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:10:08.938521 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:08.938454 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-util" (OuterVolumeSpecName: "util") pod "ac315570-366d-4412-aca4-c6137390ccd8" (UID: "ac315570-366d-4412-aca4-c6137390ccd8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:10:09.031164 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:09.031131 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-util\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:10:09.031164 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:09.031157 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac315570-366d-4412-aca4-c6137390ccd8-bundle\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:10:09.031164 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:09.031167 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-drprq\" (UniqueName: \"kubernetes.io/projected/ac315570-366d-4412-aca4-c6137390ccd8-kube-api-access-drprq\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:10:09.710653 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:09.710626 2572 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" Apr 20 15:10:09.710822 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:09.710618 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bpzr7" event={"ID":"ac315570-366d-4412-aca4-c6137390ccd8","Type":"ContainerDied","Data":"b27dd8fa960cf7926c5e75d56323c098b5b7da5c9cd3dede53533a18fd4a8533"} Apr 20 15:10:09.710822 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:09.710736 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27dd8fa960cf7926c5e75d56323c098b5b7da5c9cd3dede53533a18fd4a8533" Apr 20 15:10:18.706398 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:18.706368 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-jz9pb" Apr 20 15:10:20.966753 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.966718 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q"] Apr 20 15:10:20.967133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.966971 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac315570-366d-4412-aca4-c6137390ccd8" containerName="util" Apr 20 15:10:20.967133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.966982 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac315570-366d-4412-aca4-c6137390ccd8" containerName="util" Apr 20 15:10:20.967133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.966992 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac315570-366d-4412-aca4-c6137390ccd8" containerName="pull" Apr 20 15:10:20.967133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.966997 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ac315570-366d-4412-aca4-c6137390ccd8" containerName="pull" Apr 20 15:10:20.967133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.967009 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac315570-366d-4412-aca4-c6137390ccd8" containerName="extract" Apr 20 15:10:20.967133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.967015 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac315570-366d-4412-aca4-c6137390ccd8" containerName="extract" Apr 20 15:10:20.967133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.967053 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac315570-366d-4412-aca4-c6137390ccd8" containerName="extract" Apr 20 15:10:20.970417 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.970397 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:20.973286 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.973260 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 15:10:20.973570 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.973555 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5dxmm\"" Apr 20 15:10:20.974172 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.974157 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 15:10:20.994399 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:20.994368 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q"] Apr 20 15:10:21.013680 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.013654 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g6bg\" (UniqueName: \"kubernetes.io/projected/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-kube-api-access-4g6bg\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:21.013839 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.013691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:21.013839 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.013709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:21.114332 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.114301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4g6bg\" (UniqueName: \"kubernetes.io/projected/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-kube-api-access-4g6bg\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:21.114521 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.114342 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:21.114521 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.114360 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:21.114705 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.114689 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:21.114773 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.114752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:21.122402 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.122377 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g6bg\" (UniqueName: 
\"kubernetes.io/projected/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-kube-api-access-4g6bg\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:21.279550 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.279507 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:21.407814 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.407639 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q"] Apr 20 15:10:21.410539 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.410512 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4"] Apr 20 15:10:21.410698 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:10:21.410669 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b0e509a_cf43_4fc2_85bb_0c72ff21cec0.slice/crio-c4524a13a454a789c8ea95368146fb62551e189079c9a556ce081eb0a454b360 WatchSource:0}: Error finding container c4524a13a454a789c8ea95368146fb62551e189079c9a556ce081eb0a454b360: Status 404 returned error can't find the container with id c4524a13a454a789c8ea95368146fb62551e189079c9a556ce081eb0a454b360 Apr 20 15:10:21.418667 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.418652 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" Apr 20 15:10:21.421273 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.421255 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 15:10:21.421396 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.421344 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 15:10:21.421776 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.421750 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 15:10:21.421874 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.421793 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-8sbn6\"" Apr 20 15:10:21.421874 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.421865 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 15:10:21.426820 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.426798 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4"] Apr 20 15:10:21.516456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.516412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a507084f-a532-4ee0-b5ae-4bbde0f9b484-tmp\") pod \"kube-auth-proxy-7fcf5d587f-fnkd4\" (UID: \"a507084f-a532-4ee0-b5ae-4bbde0f9b484\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" Apr 20 15:10:21.516636 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.516462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw4h5\" (UniqueName: 
\"kubernetes.io/projected/a507084f-a532-4ee0-b5ae-4bbde0f9b484-kube-api-access-lw4h5\") pod \"kube-auth-proxy-7fcf5d587f-fnkd4\" (UID: \"a507084f-a532-4ee0-b5ae-4bbde0f9b484\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" Apr 20 15:10:21.516636 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.516558 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a507084f-a532-4ee0-b5ae-4bbde0f9b484-tls-certs\") pod \"kube-auth-proxy-7fcf5d587f-fnkd4\" (UID: \"a507084f-a532-4ee0-b5ae-4bbde0f9b484\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" Apr 20 15:10:21.617558 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.617450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4h5\" (UniqueName: \"kubernetes.io/projected/a507084f-a532-4ee0-b5ae-4bbde0f9b484-kube-api-access-lw4h5\") pod \"kube-auth-proxy-7fcf5d587f-fnkd4\" (UID: \"a507084f-a532-4ee0-b5ae-4bbde0f9b484\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" Apr 20 15:10:21.617558 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.617535 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a507084f-a532-4ee0-b5ae-4bbde0f9b484-tls-certs\") pod \"kube-auth-proxy-7fcf5d587f-fnkd4\" (UID: \"a507084f-a532-4ee0-b5ae-4bbde0f9b484\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" Apr 20 15:10:21.617781 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.617604 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a507084f-a532-4ee0-b5ae-4bbde0f9b484-tmp\") pod \"kube-auth-proxy-7fcf5d587f-fnkd4\" (UID: \"a507084f-a532-4ee0-b5ae-4bbde0f9b484\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" Apr 20 15:10:21.619751 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.619730 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a507084f-a532-4ee0-b5ae-4bbde0f9b484-tmp\") pod \"kube-auth-proxy-7fcf5d587f-fnkd4\" (UID: \"a507084f-a532-4ee0-b5ae-4bbde0f9b484\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" Apr 20 15:10:21.620002 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.619983 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a507084f-a532-4ee0-b5ae-4bbde0f9b484-tls-certs\") pod \"kube-auth-proxy-7fcf5d587f-fnkd4\" (UID: \"a507084f-a532-4ee0-b5ae-4bbde0f9b484\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" Apr 20 15:10:21.625275 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.625252 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw4h5\" (UniqueName: \"kubernetes.io/projected/a507084f-a532-4ee0-b5ae-4bbde0f9b484-kube-api-access-lw4h5\") pod \"kube-auth-proxy-7fcf5d587f-fnkd4\" (UID: \"a507084f-a532-4ee0-b5ae-4bbde0f9b484\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" Apr 20 15:10:21.734124 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.734089 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" Apr 20 15:10:21.749320 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.749289 2572 generic.go:358] "Generic (PLEG): container finished" podID="7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" containerID="c94f2b610ae1f5ae1e5d0266ed6af567edf39c8fffc47d391338723d56a166e4" exitCode=0 Apr 20 15:10:21.749448 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.749381 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" event={"ID":"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0","Type":"ContainerDied","Data":"c94f2b610ae1f5ae1e5d0266ed6af567edf39c8fffc47d391338723d56a166e4"} Apr 20 15:10:21.749448 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.749425 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" event={"ID":"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0","Type":"ContainerStarted","Data":"c4524a13a454a789c8ea95368146fb62551e189079c9a556ce081eb0a454b360"} Apr 20 15:10:21.851283 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:21.851256 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4"] Apr 20 15:10:21.854051 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:10:21.854019 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda507084f_a532_4ee0_b5ae_4bbde0f9b484.slice/crio-5af0e506da6bd82ecf53256a84c61b308c3ffc0e27b1610a0d234ad4c96dc687 WatchSource:0}: Error finding container 5af0e506da6bd82ecf53256a84c61b308c3ffc0e27b1610a0d234ad4c96dc687: Status 404 returned error can't find the container with id 5af0e506da6bd82ecf53256a84c61b308c3ffc0e27b1610a0d234ad4c96dc687 Apr 20 15:10:22.754916 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:22.754886 2572 generic.go:358] "Generic (PLEG): container 
finished" podID="7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" containerID="9ad021f4e53032168ce001644431f5d2c329a00fed736f7164f2113d9f82222f" exitCode=0 Apr 20 15:10:22.755393 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:22.754978 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" event={"ID":"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0","Type":"ContainerDied","Data":"9ad021f4e53032168ce001644431f5d2c329a00fed736f7164f2113d9f82222f"} Apr 20 15:10:22.756630 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:22.756580 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" event={"ID":"a507084f-a532-4ee0-b5ae-4bbde0f9b484","Type":"ContainerStarted","Data":"5af0e506da6bd82ecf53256a84c61b308c3ffc0e27b1610a0d234ad4c96dc687"} Apr 20 15:10:23.762196 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:23.762163 2572 generic.go:358] "Generic (PLEG): container finished" podID="7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" containerID="91434daec8facb3fa9817f03a97fb0a82289b5442f658c10ad9325a60a0768a4" exitCode=0 Apr 20 15:10:23.762711 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:23.762230 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" event={"ID":"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0","Type":"ContainerDied","Data":"91434daec8facb3fa9817f03a97fb0a82289b5442f658c10ad9325a60a0768a4"} Apr 20 15:10:24.784402 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:24.784363 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-ws2wq"] Apr 20 15:10:24.787899 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:24.787851 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:24.790543 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:24.790512 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 20 15:10:24.790678 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:24.790603 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-78lg9\"" Apr 20 15:10:24.796668 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:24.796326 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-ws2wq"] Apr 20 15:10:24.843211 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:24.843162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sdgj\" (UniqueName: \"kubernetes.io/projected/ba566142-9484-403a-b32d-f2a048b6021b-kube-api-access-2sdgj\") pod \"odh-model-controller-858dbf95b8-ws2wq\" (UID: \"ba566142-9484-403a-b32d-f2a048b6021b\") " pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:24.843370 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:24.843221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba566142-9484-403a-b32d-f2a048b6021b-cert\") pod \"odh-model-controller-858dbf95b8-ws2wq\" (UID: \"ba566142-9484-403a-b32d-f2a048b6021b\") " pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:24.944511 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:24.944466 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sdgj\" (UniqueName: \"kubernetes.io/projected/ba566142-9484-403a-b32d-f2a048b6021b-kube-api-access-2sdgj\") pod \"odh-model-controller-858dbf95b8-ws2wq\" (UID: \"ba566142-9484-403a-b32d-f2a048b6021b\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:24.944689 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:24.944547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba566142-9484-403a-b32d-f2a048b6021b-cert\") pod \"odh-model-controller-858dbf95b8-ws2wq\" (UID: \"ba566142-9484-403a-b32d-f2a048b6021b\") " pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:24.944689 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:10:24.944655 2572 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 15:10:24.944803 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:10:24.944716 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba566142-9484-403a-b32d-f2a048b6021b-cert podName:ba566142-9484-403a-b32d-f2a048b6021b nodeName:}" failed. No retries permitted until 2026-04-20 15:10:25.44469668 +0000 UTC m=+485.575571965 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba566142-9484-403a-b32d-f2a048b6021b-cert") pod "odh-model-controller-858dbf95b8-ws2wq" (UID: "ba566142-9484-403a-b32d-f2a048b6021b") : secret "odh-model-controller-webhook-cert" not found Apr 20 15:10:24.955641 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:24.955604 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sdgj\" (UniqueName: \"kubernetes.io/projected/ba566142-9484-403a-b32d-f2a048b6021b-kube-api-access-2sdgj\") pod \"odh-model-controller-858dbf95b8-ws2wq\" (UID: \"ba566142-9484-403a-b32d-f2a048b6021b\") " pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:25.449155 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:25.449124 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba566142-9484-403a-b32d-f2a048b6021b-cert\") pod \"odh-model-controller-858dbf95b8-ws2wq\" (UID: \"ba566142-9484-403a-b32d-f2a048b6021b\") " pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:25.451794 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:25.451765 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba566142-9484-403a-b32d-f2a048b6021b-cert\") pod \"odh-model-controller-858dbf95b8-ws2wq\" (UID: \"ba566142-9484-403a-b32d-f2a048b6021b\") " pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:25.701020 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:25.700921 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:26.159074 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.159049 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:26.256646 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.256623 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-bundle\") pod \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " Apr 20 15:10:26.256808 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.256689 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-util\") pod \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " Apr 20 15:10:26.256808 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.256750 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g6bg\" (UniqueName: \"kubernetes.io/projected/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-kube-api-access-4g6bg\") pod \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\" (UID: \"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0\") " Apr 20 15:10:26.257513 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.257473 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-bundle" (OuterVolumeSpecName: "bundle") pod "7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" (UID: "7b0e509a-cf43-4fc2-85bb-0c72ff21cec0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:10:26.259181 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.259158 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-kube-api-access-4g6bg" (OuterVolumeSpecName: "kube-api-access-4g6bg") pod "7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" (UID: "7b0e509a-cf43-4fc2-85bb-0c72ff21cec0"). InnerVolumeSpecName "kube-api-access-4g6bg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:10:26.262331 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.262292 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-util" (OuterVolumeSpecName: "util") pod "7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" (UID: "7b0e509a-cf43-4fc2-85bb-0c72ff21cec0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:10:26.357690 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.357639 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4g6bg\" (UniqueName: \"kubernetes.io/projected/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-kube-api-access-4g6bg\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:10:26.357690 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.357690 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-bundle\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:10:26.357861 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.357704 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b0e509a-cf43-4fc2-85bb-0c72ff21cec0-util\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:10:26.467904 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.467862 2572 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-ws2wq"] Apr 20 15:10:26.472407 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:10:26.472381 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba566142_9484_403a_b32d_f2a048b6021b.slice/crio-ac18eeb98c9344a7e33983415cc9a775f443b7bf974711af5885280ae0d62882 WatchSource:0}: Error finding container ac18eeb98c9344a7e33983415cc9a775f443b7bf974711af5885280ae0d62882: Status 404 returned error can't find the container with id ac18eeb98c9344a7e33983415cc9a775f443b7bf974711af5885280ae0d62882 Apr 20 15:10:26.775316 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.775274 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" event={"ID":"a507084f-a532-4ee0-b5ae-4bbde0f9b484","Type":"ContainerStarted","Data":"ef5a0a482e42ddf606f15ca7158d6e35cbe5c59850b2ae20c395eb19bec56a1d"} Apr 20 15:10:26.777267 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.777236 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" event={"ID":"7b0e509a-cf43-4fc2-85bb-0c72ff21cec0","Type":"ContainerDied","Data":"c4524a13a454a789c8ea95368146fb62551e189079c9a556ce081eb0a454b360"} Apr 20 15:10:26.777387 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.777272 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4524a13a454a789c8ea95368146fb62551e189079c9a556ce081eb0a454b360" Apr 20 15:10:26.777387 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.777299 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zz44q" Apr 20 15:10:26.778275 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.778239 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" event={"ID":"ba566142-9484-403a-b32d-f2a048b6021b","Type":"ContainerStarted","Data":"ac18eeb98c9344a7e33983415cc9a775f443b7bf974711af5885280ae0d62882"} Apr 20 15:10:26.793808 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:26.793748 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-fnkd4" podStartSLOduration=1.447071861 podStartE2EDuration="5.793736984s" podCreationTimestamp="2026-04-20 15:10:21 +0000 UTC" firstStartedPulling="2026-04-20 15:10:21.856299613 +0000 UTC m=+481.987174893" lastFinishedPulling="2026-04-20 15:10:26.202964735 +0000 UTC m=+486.333840016" observedRunningTime="2026-04-20 15:10:26.792530375 +0000 UTC m=+486.923405677" watchObservedRunningTime="2026-04-20 15:10:26.793736984 +0000 UTC m=+486.924612287" Apr 20 15:10:29.791538 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:29.791480 2572 generic.go:358] "Generic (PLEG): container finished" podID="ba566142-9484-403a-b32d-f2a048b6021b" containerID="b5e9d96cfd118f635eb18fe3aa43da21f7a422ee60f0f4630df1b3fa87fa5aa4" exitCode=1 Apr 20 15:10:29.791538 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:29.791541 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" event={"ID":"ba566142-9484-403a-b32d-f2a048b6021b","Type":"ContainerDied","Data":"b5e9d96cfd118f635eb18fe3aa43da21f7a422ee60f0f4630df1b3fa87fa5aa4"} Apr 20 15:10:29.791933 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:29.791721 2572 scope.go:117] "RemoveContainer" containerID="b5e9d96cfd118f635eb18fe3aa43da21f7a422ee60f0f4630df1b3fa87fa5aa4" Apr 20 15:10:30.795733 ip-10-0-134-230 kubenswrapper[2572]: I0420 
15:10:30.795697 2572 generic.go:358] "Generic (PLEG): container finished" podID="ba566142-9484-403a-b32d-f2a048b6021b" containerID="fe365cf810c13dc180721dcbaa7323020ee33e27ee3b6a6a6073779538f5c473" exitCode=1 Apr 20 15:10:30.796151 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.795777 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" event={"ID":"ba566142-9484-403a-b32d-f2a048b6021b","Type":"ContainerDied","Data":"fe365cf810c13dc180721dcbaa7323020ee33e27ee3b6a6a6073779538f5c473"} Apr 20 15:10:30.796151 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.795825 2572 scope.go:117] "RemoveContainer" containerID="b5e9d96cfd118f635eb18fe3aa43da21f7a422ee60f0f4630df1b3fa87fa5aa4" Apr 20 15:10:30.796151 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.796019 2572 scope.go:117] "RemoveContainer" containerID="fe365cf810c13dc180721dcbaa7323020ee33e27ee3b6a6a6073779538f5c473" Apr 20 15:10:30.796291 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:10:30.796203 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-ws2wq_opendatahub(ba566142-9484-403a-b32d-f2a048b6021b)\"" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" podUID="ba566142-9484-403a-b32d-f2a048b6021b" Apr 20 15:10:30.871887 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.871854 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2r77g"] Apr 20 15:10:30.872114 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.872103 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" containerName="pull" Apr 20 15:10:30.872166 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.872116 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" containerName="pull" Apr 20 15:10:30.872166 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.872132 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" containerName="extract" Apr 20 15:10:30.872166 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.872138 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" containerName="extract" Apr 20 15:10:30.872166 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.872151 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" containerName="util" Apr 20 15:10:30.872166 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.872157 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" containerName="util" Apr 20 15:10:30.872314 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.872196 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b0e509a-cf43-4fc2-85bb-0c72ff21cec0" containerName="extract" Apr 20 15:10:30.875991 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.875967 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" Apr 20 15:10:30.878229 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.878208 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 20 15:10:30.878537 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.878515 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-xtqxb\"" Apr 20 15:10:30.886072 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.886033 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2r77g"] Apr 20 15:10:30.996668 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.996632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn8q7\" (UniqueName: \"kubernetes.io/projected/0653a16c-21ee-4062-b961-87c17974c316-kube-api-access-rn8q7\") pod \"kserve-controller-manager-856948b99f-2r77g\" (UID: \"0653a16c-21ee-4062-b961-87c17974c316\") " pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" Apr 20 15:10:30.996825 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:30.996709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0653a16c-21ee-4062-b961-87c17974c316-cert\") pod \"kserve-controller-manager-856948b99f-2r77g\" (UID: \"0653a16c-21ee-4062-b961-87c17974c316\") " pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" Apr 20 15:10:31.097933 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:31.097848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0653a16c-21ee-4062-b961-87c17974c316-cert\") pod \"kserve-controller-manager-856948b99f-2r77g\" (UID: \"0653a16c-21ee-4062-b961-87c17974c316\") " 
pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" Apr 20 15:10:31.097933 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:31.097907 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn8q7\" (UniqueName: \"kubernetes.io/projected/0653a16c-21ee-4062-b961-87c17974c316-kube-api-access-rn8q7\") pod \"kserve-controller-manager-856948b99f-2r77g\" (UID: \"0653a16c-21ee-4062-b961-87c17974c316\") " pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" Apr 20 15:10:31.098115 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:10:31.098001 2572 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 15:10:31.098115 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:10:31.098067 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0653a16c-21ee-4062-b961-87c17974c316-cert podName:0653a16c-21ee-4062-b961-87c17974c316 nodeName:}" failed. No retries permitted until 2026-04-20 15:10:31.598050254 +0000 UTC m=+491.728925536 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0653a16c-21ee-4062-b961-87c17974c316-cert") pod "kserve-controller-manager-856948b99f-2r77g" (UID: "0653a16c-21ee-4062-b961-87c17974c316") : secret "kserve-webhook-server-cert" not found Apr 20 15:10:31.112495 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:31.112455 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn8q7\" (UniqueName: \"kubernetes.io/projected/0653a16c-21ee-4062-b961-87c17974c316-kube-api-access-rn8q7\") pod \"kserve-controller-manager-856948b99f-2r77g\" (UID: \"0653a16c-21ee-4062-b961-87c17974c316\") " pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" Apr 20 15:10:31.600439 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:31.600396 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0653a16c-21ee-4062-b961-87c17974c316-cert\") pod \"kserve-controller-manager-856948b99f-2r77g\" (UID: \"0653a16c-21ee-4062-b961-87c17974c316\") " pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" Apr 20 15:10:31.602752 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:31.602731 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0653a16c-21ee-4062-b961-87c17974c316-cert\") pod \"kserve-controller-manager-856948b99f-2r77g\" (UID: \"0653a16c-21ee-4062-b961-87c17974c316\") " pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" Apr 20 15:10:31.788299 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:31.788263 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" Apr 20 15:10:31.800765 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:31.800739 2572 scope.go:117] "RemoveContainer" containerID="fe365cf810c13dc180721dcbaa7323020ee33e27ee3b6a6a6073779538f5c473" Apr 20 15:10:31.801148 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:10:31.800918 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-ws2wq_opendatahub(ba566142-9484-403a-b32d-f2a048b6021b)\"" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" podUID="ba566142-9484-403a-b32d-f2a048b6021b" Apr 20 15:10:31.927332 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:31.927310 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2r77g"] Apr 20 15:10:31.929335 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:10:31.929304 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0653a16c_21ee_4062_b961_87c17974c316.slice/crio-c6c87c174a3d9de8e57833c8ef11802ac24ca972a09a6bcfa63ffd0e2a7986b5 WatchSource:0}: Error finding container c6c87c174a3d9de8e57833c8ef11802ac24ca972a09a6bcfa63ffd0e2a7986b5: Status 404 returned error can't find the container with id c6c87c174a3d9de8e57833c8ef11802ac24ca972a09a6bcfa63ffd0e2a7986b5 Apr 20 15:10:32.804282 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:32.804239 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" event={"ID":"0653a16c-21ee-4062-b961-87c17974c316","Type":"ContainerStarted","Data":"c6c87c174a3d9de8e57833c8ef11802ac24ca972a09a6bcfa63ffd0e2a7986b5"} Apr 20 15:10:35.415733 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.415700 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d"] Apr 20 15:10:35.420026 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.420004 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:35.423198 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.423173 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 15:10:35.423312 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.423212 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 15:10:35.424208 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.424190 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5dxmm\"" Apr 20 15:10:35.432237 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.432194 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d"] Apr 20 15:10:35.530897 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.530856 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzmlt\" (UniqueName: \"kubernetes.io/projected/7f14279a-b8cb-419c-a8b4-e61d472fce2c-kube-api-access-fzmlt\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:35.531057 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.530921 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:35.531057 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.530982 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:35.631816 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.631791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzmlt\" (UniqueName: \"kubernetes.io/projected/7f14279a-b8cb-419c-a8b4-e61d472fce2c-kube-api-access-fzmlt\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:35.631927 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.631830 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:35.631927 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.631867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:35.632188 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.632170 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:35.632245 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.632203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:35.645757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.645737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzmlt\" (UniqueName: \"kubernetes.io/projected/7f14279a-b8cb-419c-a8b4-e61d472fce2c-kube-api-access-fzmlt\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:35.701916 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.701892 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:35.702241 ip-10-0-134-230 kubenswrapper[2572]: I0420 
15:10:35.702228 2572 scope.go:117] "RemoveContainer" containerID="fe365cf810c13dc180721dcbaa7323020ee33e27ee3b6a6a6073779538f5c473" Apr 20 15:10:35.702386 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:10:35.702371 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-ws2wq_opendatahub(ba566142-9484-403a-b32d-f2a048b6021b)\"" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" podUID="ba566142-9484-403a-b32d-f2a048b6021b" Apr 20 15:10:35.731869 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.731844 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:35.816366 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.815972 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" event={"ID":"0653a16c-21ee-4062-b961-87c17974c316","Type":"ContainerStarted","Data":"a4c93bb0812ce056152f20936a932cdfa3db50c962f83853fa88ec6b04a01ee2"} Apr 20 15:10:35.816533 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.816457 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" Apr 20 15:10:35.850948 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.850889 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" podStartSLOduration=2.177285656 podStartE2EDuration="5.850866726s" podCreationTimestamp="2026-04-20 15:10:30 +0000 UTC" firstStartedPulling="2026-04-20 15:10:31.930659602 +0000 UTC m=+492.061534886" lastFinishedPulling="2026-04-20 15:10:35.604240672 +0000 UTC m=+495.735115956" observedRunningTime="2026-04-20 15:10:35.849701516 +0000 UTC m=+495.980576820" 
watchObservedRunningTime="2026-04-20 15:10:35.850866726 +0000 UTC m=+495.981742029" Apr 20 15:10:35.868411 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:35.868377 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d"] Apr 20 15:10:35.872752 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:10:35.872716 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f14279a_b8cb_419c_a8b4_e61d472fce2c.slice/crio-67ae649e37509f14aa9b2ec6bfd06b6c5c23821a5d4d6196b8e21affa236c77b WatchSource:0}: Error finding container 67ae649e37509f14aa9b2ec6bfd06b6c5c23821a5d4d6196b8e21affa236c77b: Status 404 returned error can't find the container with id 67ae649e37509f14aa9b2ec6bfd06b6c5c23821a5d4d6196b8e21affa236c77b Apr 20 15:10:36.637239 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.637206 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gg6km"] Apr 20 15:10:36.639955 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.639936 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" Apr 20 15:10:36.644701 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.644685 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 15:10:36.644807 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.644786 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 15:10:36.645160 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.645141 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-427lt\"" Apr 20 15:10:36.657244 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.657214 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gg6km"] Apr 20 15:10:36.740195 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.740165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp78q\" (UniqueName: \"kubernetes.io/projected/20cac620-948b-44f0-b2b4-d2c0192a0028-kube-api-access-kp78q\") pod \"servicemesh-operator3-55f49c5f94-gg6km\" (UID: \"20cac620-948b-44f0-b2b4-d2c0192a0028\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" Apr 20 15:10:36.740333 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.740209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/20cac620-948b-44f0-b2b4-d2c0192a0028-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gg6km\" (UID: \"20cac620-948b-44f0-b2b4-d2c0192a0028\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" Apr 20 15:10:36.820452 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.820420 2572 generic.go:358] "Generic (PLEG): 
container finished" podID="7f14279a-b8cb-419c-a8b4-e61d472fce2c" containerID="4ff1ec32a817685d8557569e1464dd9d110ef1aaced7cfd6f644035331a638ae" exitCode=0 Apr 20 15:10:36.820646 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.820514 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" event={"ID":"7f14279a-b8cb-419c-a8b4-e61d472fce2c","Type":"ContainerDied","Data":"4ff1ec32a817685d8557569e1464dd9d110ef1aaced7cfd6f644035331a638ae"} Apr 20 15:10:36.820646 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.820550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" event={"ID":"7f14279a-b8cb-419c-a8b4-e61d472fce2c","Type":"ContainerStarted","Data":"67ae649e37509f14aa9b2ec6bfd06b6c5c23821a5d4d6196b8e21affa236c77b"} Apr 20 15:10:36.841127 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.841104 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp78q\" (UniqueName: \"kubernetes.io/projected/20cac620-948b-44f0-b2b4-d2c0192a0028-kube-api-access-kp78q\") pod \"servicemesh-operator3-55f49c5f94-gg6km\" (UID: \"20cac620-948b-44f0-b2b4-d2c0192a0028\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" Apr 20 15:10:36.841234 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.841154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/20cac620-948b-44f0-b2b4-d2c0192a0028-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gg6km\" (UID: \"20cac620-948b-44f0-b2b4-d2c0192a0028\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" Apr 20 15:10:36.843540 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.843518 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: 
\"kubernetes.io/downward-api/20cac620-948b-44f0-b2b4-d2c0192a0028-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gg6km\" (UID: \"20cac620-948b-44f0-b2b4-d2c0192a0028\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" Apr 20 15:10:36.849469 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.849445 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp78q\" (UniqueName: \"kubernetes.io/projected/20cac620-948b-44f0-b2b4-d2c0192a0028-kube-api-access-kp78q\") pod \"servicemesh-operator3-55f49c5f94-gg6km\" (UID: \"20cac620-948b-44f0-b2b4-d2c0192a0028\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" Apr 20 15:10:36.955021 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:36.954955 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" Apr 20 15:10:37.084407 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:37.084369 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gg6km"] Apr 20 15:10:37.086702 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:10:37.086675 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20cac620_948b_44f0_b2b4_d2c0192a0028.slice/crio-318885c5e06abe2e564d431088b2a16c8f179d4c3335491f4711bc12525f9226 WatchSource:0}: Error finding container 318885c5e06abe2e564d431088b2a16c8f179d4c3335491f4711bc12525f9226: Status 404 returned error can't find the container with id 318885c5e06abe2e564d431088b2a16c8f179d4c3335491f4711bc12525f9226 Apr 20 15:10:37.825296 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:37.825216 2572 generic.go:358] "Generic (PLEG): container finished" podID="7f14279a-b8cb-419c-a8b4-e61d472fce2c" containerID="329745c84b86ccf1feaa9cc091845b7ccd3925839437a8775736129f85aa38e3" exitCode=0 Apr 20 15:10:37.825730 ip-10-0-134-230 
kubenswrapper[2572]: I0420 15:10:37.825305 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" event={"ID":"7f14279a-b8cb-419c-a8b4-e61d472fce2c","Type":"ContainerDied","Data":"329745c84b86ccf1feaa9cc091845b7ccd3925839437a8775736129f85aa38e3"} Apr 20 15:10:37.826537 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:37.826514 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" event={"ID":"20cac620-948b-44f0-b2b4-d2c0192a0028","Type":"ContainerStarted","Data":"318885c5e06abe2e564d431088b2a16c8f179d4c3335491f4711bc12525f9226"} Apr 20 15:10:38.835245 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:38.835205 2572 generic.go:358] "Generic (PLEG): container finished" podID="7f14279a-b8cb-419c-a8b4-e61d472fce2c" containerID="c2a0a9d7311a6b7c39dbad6d62825d822d5c7cc7a53e12f0d5f7f454820e068a" exitCode=0 Apr 20 15:10:38.835655 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:38.835275 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" event={"ID":"7f14279a-b8cb-419c-a8b4-e61d472fce2c","Type":"ContainerDied","Data":"c2a0a9d7311a6b7c39dbad6d62825d822d5c7cc7a53e12f0d5f7f454820e068a"} Apr 20 15:10:39.962173 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:39.962151 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:40.069225 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.069187 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-util\") pod \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " Apr 20 15:10:40.069416 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.069268 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzmlt\" (UniqueName: \"kubernetes.io/projected/7f14279a-b8cb-419c-a8b4-e61d472fce2c-kube-api-access-fzmlt\") pod \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " Apr 20 15:10:40.069416 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.069310 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-bundle\") pod \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\" (UID: \"7f14279a-b8cb-419c-a8b4-e61d472fce2c\") " Apr 20 15:10:40.070309 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.070276 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-bundle" (OuterVolumeSpecName: "bundle") pod "7f14279a-b8cb-419c-a8b4-e61d472fce2c" (UID: "7f14279a-b8cb-419c-a8b4-e61d472fce2c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:10:40.072036 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.072001 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f14279a-b8cb-419c-a8b4-e61d472fce2c-kube-api-access-fzmlt" (OuterVolumeSpecName: "kube-api-access-fzmlt") pod "7f14279a-b8cb-419c-a8b4-e61d472fce2c" (UID: "7f14279a-b8cb-419c-a8b4-e61d472fce2c"). InnerVolumeSpecName "kube-api-access-fzmlt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:10:40.074801 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.074775 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-util" (OuterVolumeSpecName: "util") pod "7f14279a-b8cb-419c-a8b4-e61d472fce2c" (UID: "7f14279a-b8cb-419c-a8b4-e61d472fce2c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:10:40.170811 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.170678 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fzmlt\" (UniqueName: \"kubernetes.io/projected/7f14279a-b8cb-419c-a8b4-e61d472fce2c-kube-api-access-fzmlt\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:10:40.170811 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.170713 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-bundle\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:10:40.170811 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.170728 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f14279a-b8cb-419c-a8b4-e61d472fce2c-util\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:10:40.842852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.842823 2572 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" Apr 20 15:10:40.842852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.842830 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hpz9d" event={"ID":"7f14279a-b8cb-419c-a8b4-e61d472fce2c","Type":"ContainerDied","Data":"67ae649e37509f14aa9b2ec6bfd06b6c5c23821a5d4d6196b8e21affa236c77b"} Apr 20 15:10:40.843081 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.842868 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67ae649e37509f14aa9b2ec6bfd06b6c5c23821a5d4d6196b8e21affa236c77b" Apr 20 15:10:40.844382 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.844360 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" event={"ID":"20cac620-948b-44f0-b2b4-d2c0192a0028","Type":"ContainerStarted","Data":"2c5e79338a68cd63771aa86d3e80a7dd058e09b4b5a04a948b76fa2eaac6ef84"} Apr 20 15:10:40.844521 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.844475 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" Apr 20 15:10:40.867716 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:40.867672 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" podStartSLOduration=2.094772209 podStartE2EDuration="4.867655491s" podCreationTimestamp="2026-04-20 15:10:36 +0000 UTC" firstStartedPulling="2026-04-20 15:10:37.089176467 +0000 UTC m=+497.220051753" lastFinishedPulling="2026-04-20 15:10:39.862059737 +0000 UTC m=+499.992935035" observedRunningTime="2026-04-20 15:10:40.865159675 +0000 UTC m=+500.996034978" watchObservedRunningTime="2026-04-20 15:10:40.867655491 +0000 UTC m=+500.998530793" Apr 
20 15:10:45.701605 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:45.701570 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:45.702184 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:45.702062 2572 scope.go:117] "RemoveContainer" containerID="fe365cf810c13dc180721dcbaa7323020ee33e27ee3b6a6a6073779538f5c473" Apr 20 15:10:46.866212 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:46.866181 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" event={"ID":"ba566142-9484-403a-b32d-f2a048b6021b","Type":"ContainerStarted","Data":"2b930cd6c6e95ee23afb8e6c905b748946bf63f957ebbd971eda5d8b10d55306"} Apr 20 15:10:46.866600 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:46.866378 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:10:46.886671 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:46.886627 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" podStartSLOduration=3.293613326 podStartE2EDuration="22.886615247s" podCreationTimestamp="2026-04-20 15:10:24 +0000 UTC" firstStartedPulling="2026-04-20 15:10:26.473698536 +0000 UTC m=+486.604573817" lastFinishedPulling="2026-04-20 15:10:46.066700457 +0000 UTC m=+506.197575738" observedRunningTime="2026-04-20 15:10:46.883292868 +0000 UTC m=+507.014168171" watchObservedRunningTime="2026-04-20 15:10:46.886615247 +0000 UTC m=+507.017490549" Apr 20 15:10:51.850713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:51.850678 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gg6km" Apr 20 15:10:57.870403 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:10:57.870369 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="opendatahub/odh-model-controller-858dbf95b8-ws2wq" Apr 20 15:11:07.831631 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:07.831599 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-2r77g" Apr 20 15:11:21.749182 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.749145 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dw2ql"] Apr 20 15:11:21.749605 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.749432 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f14279a-b8cb-419c-a8b4-e61d472fce2c" containerName="util" Apr 20 15:11:21.749605 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.749443 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f14279a-b8cb-419c-a8b4-e61d472fce2c" containerName="util" Apr 20 15:11:21.749605 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.749452 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f14279a-b8cb-419c-a8b4-e61d472fce2c" containerName="pull" Apr 20 15:11:21.749605 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.749458 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f14279a-b8cb-419c-a8b4-e61d472fce2c" containerName="pull" Apr 20 15:11:21.749605 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.749466 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f14279a-b8cb-419c-a8b4-e61d472fce2c" containerName="extract" Apr 20 15:11:21.749605 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.749472 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f14279a-b8cb-419c-a8b4-e61d472fce2c" containerName="extract" Apr 20 15:11:21.749605 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.749525 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f14279a-b8cb-419c-a8b4-e61d472fce2c" containerName="extract" Apr 20 15:11:21.755192 
ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.755174 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" Apr 20 15:11:21.757864 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.757838 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 15:11:21.759102 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.759074 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 15:11:21.759222 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.759178 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-7vb42\"" Apr 20 15:11:21.761470 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.761449 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dw2ql"] Apr 20 15:11:21.782530 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.782502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js6n8\" (UniqueName: \"kubernetes.io/projected/ab38a0df-e7d9-40fc-b543-abcd27a74532-kube-api-access-js6n8\") pod \"kuadrant-operator-catalog-dw2ql\" (UID: \"ab38a0df-e7d9-40fc-b543-abcd27a74532\") " pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" Apr 20 15:11:21.882888 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.882860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-js6n8\" (UniqueName: \"kubernetes.io/projected/ab38a0df-e7d9-40fc-b543-abcd27a74532-kube-api-access-js6n8\") pod \"kuadrant-operator-catalog-dw2ql\" (UID: \"ab38a0df-e7d9-40fc-b543-abcd27a74532\") " pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" Apr 20 15:11:21.890938 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.890915 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-js6n8\" (UniqueName: \"kubernetes.io/projected/ab38a0df-e7d9-40fc-b543-abcd27a74532-kube-api-access-js6n8\") pod \"kuadrant-operator-catalog-dw2ql\" (UID: \"ab38a0df-e7d9-40fc-b543-abcd27a74532\") " pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" Apr 20 15:11:21.995297 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.995270 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd"] Apr 20 15:11:21.997428 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.997401 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:21.999904 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.999849 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 15:11:21.999904 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.999873 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-4d47v\"" Apr 20 15:11:22.000033 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:21.999988 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 15:11:22.000100 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.000085 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 15:11:22.000219 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.000206 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 15:11:22.011614 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.011590 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd"] Apr 20 15:11:22.065572 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.065536 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" Apr 20 15:11:22.084454 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.084426 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.084590 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.084460 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.084590 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.084495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.084590 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.084526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rz74\" (UniqueName: \"kubernetes.io/projected/7159e679-12b5-4114-8f79-9e8ea4c77bdc-kube-api-access-5rz74\") pod 
\"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.084590 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.084545 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.084590 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.084560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7159e679-12b5-4114-8f79-9e8ea4c77bdc-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.084590 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.084577 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7159e679-12b5-4114-8f79-9e8ea4c77bdc-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.112768 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.112722 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dw2ql"] Apr 20 15:11:22.185173 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.185119 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7159e679-12b5-4114-8f79-9e8ea4c77bdc-local-certs\") pod 
\"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.185341 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.185209 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.185341 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.185313 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.185465 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.185350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.185465 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.185400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rz74\" (UniqueName: \"kubernetes.io/projected/7159e679-12b5-4114-8f79-9e8ea4c77bdc-kube-api-access-5rz74\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.185465 
ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.185438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.185640 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.185469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7159e679-12b5-4114-8f79-9e8ea4c77bdc-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.186913 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.186874 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.187950 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.187801 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dw2ql"] Apr 20 15:11:22.188502 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.188368 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7159e679-12b5-4114-8f79-9e8ea4c77bdc-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.188746 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.188725 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7159e679-12b5-4114-8f79-9e8ea4c77bdc-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.188907 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.188885 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.189030 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.189015 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.190636 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:11:22.190612 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab38a0df_e7d9_40fc_b543_abcd27a74532.slice/crio-6ea493adc3ca0d4d1f94285bbbaa0c9aa34886778dfa6a6566434cf56ace22ae WatchSource:0}: Error finding container 6ea493adc3ca0d4d1f94285bbbaa0c9aa34886778dfa6a6566434cf56ace22ae: Status 404 returned error can't find the container with id 6ea493adc3ca0d4d1f94285bbbaa0c9aa34886778dfa6a6566434cf56ace22ae Apr 20 15:11:22.193661 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.193641 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/7159e679-12b5-4114-8f79-9e8ea4c77bdc-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.194105 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.194090 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rz74\" (UniqueName: \"kubernetes.io/projected/7159e679-12b5-4114-8f79-9e8ea4c77bdc-kube-api-access-5rz74\") pod \"istiod-openshift-gateway-55ff986f96-fgkrd\" (UID: \"7159e679-12b5-4114-8f79-9e8ea4c77bdc\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.306905 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.306807 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:22.427481 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.427459 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd"] Apr 20 15:11:22.429528 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:11:22.429497 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7159e679_12b5_4114_8f79_9e8ea4c77bdc.slice/crio-38bee824da78c59fa115daf9610a9b2a91c2fe745b2b267932f865253477b0e4 WatchSource:0}: Error finding container 38bee824da78c59fa115daf9610a9b2a91c2fe745b2b267932f865253477b0e4: Status 404 returned error can't find the container with id 38bee824da78c59fa115daf9610a9b2a91c2fe745b2b267932f865253477b0e4 Apr 20 15:11:22.991440 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.991399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" 
event={"ID":"ab38a0df-e7d9-40fc-b543-abcd27a74532","Type":"ContainerStarted","Data":"6ea493adc3ca0d4d1f94285bbbaa0c9aa34886778dfa6a6566434cf56ace22ae"} Apr 20 15:11:22.992743 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:22.992702 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" event={"ID":"7159e679-12b5-4114-8f79-9e8ea4c77bdc","Type":"ContainerStarted","Data":"38bee824da78c59fa115daf9610a9b2a91c2fe745b2b267932f865253477b0e4"} Apr 20 15:11:25.002540 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:25.002504 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" event={"ID":"ab38a0df-e7d9-40fc-b543-abcd27a74532","Type":"ContainerStarted","Data":"30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f"} Apr 20 15:11:25.003024 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:25.002601 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" podUID="ab38a0df-e7d9-40fc-b543-abcd27a74532" containerName="registry-server" containerID="cri-o://30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f" gracePeriod=2 Apr 20 15:11:25.017904 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:25.017856 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" podStartSLOduration=1.864432715 podStartE2EDuration="4.017823575s" podCreationTimestamp="2026-04-20 15:11:21 +0000 UTC" firstStartedPulling="2026-04-20 15:11:22.191994837 +0000 UTC m=+542.322870117" lastFinishedPulling="2026-04-20 15:11:24.345385696 +0000 UTC m=+544.476260977" observedRunningTime="2026-04-20 15:11:25.016642115 +0000 UTC m=+545.147517417" watchObservedRunningTime="2026-04-20 15:11:25.017823575 +0000 UTC m=+545.148698881" Apr 20 15:11:25.235380 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:25.235358 2572 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" Apr 20 15:11:25.312699 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:25.312625 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js6n8\" (UniqueName: \"kubernetes.io/projected/ab38a0df-e7d9-40fc-b543-abcd27a74532-kube-api-access-js6n8\") pod \"ab38a0df-e7d9-40fc-b543-abcd27a74532\" (UID: \"ab38a0df-e7d9-40fc-b543-abcd27a74532\") " Apr 20 15:11:25.314804 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:25.314779 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab38a0df-e7d9-40fc-b543-abcd27a74532-kube-api-access-js6n8" (OuterVolumeSpecName: "kube-api-access-js6n8") pod "ab38a0df-e7d9-40fc-b543-abcd27a74532" (UID: "ab38a0df-e7d9-40fc-b543-abcd27a74532"). InnerVolumeSpecName "kube-api-access-js6n8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:11:25.413458 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:25.413431 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-js6n8\" (UniqueName: \"kubernetes.io/projected/ab38a0df-e7d9-40fc-b543-abcd27a74532-kube-api-access-js6n8\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:26.011052 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.011017 2572 generic.go:358] "Generic (PLEG): container finished" podID="ab38a0df-e7d9-40fc-b543-abcd27a74532" containerID="30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f" exitCode=0 Apr 20 15:11:26.011608 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.011113 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" Apr 20 15:11:26.011743 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.011110 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" event={"ID":"ab38a0df-e7d9-40fc-b543-abcd27a74532","Type":"ContainerDied","Data":"30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f"} Apr 20 15:11:26.011743 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.011696 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dw2ql" event={"ID":"ab38a0df-e7d9-40fc-b543-abcd27a74532","Type":"ContainerDied","Data":"6ea493adc3ca0d4d1f94285bbbaa0c9aa34886778dfa6a6566434cf56ace22ae"} Apr 20 15:11:26.011743 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.011723 2572 scope.go:117] "RemoveContainer" containerID="30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f" Apr 20 15:11:26.039811 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.039778 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dw2ql"] Apr 20 15:11:26.041553 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.041528 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dw2ql"] Apr 20 15:11:26.410696 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.410616 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab38a0df-e7d9-40fc-b543-abcd27a74532" path="/var/lib/kubelet/pods/ab38a0df-e7d9-40fc-b543-abcd27a74532/volumes" Apr 20 15:11:26.424879 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.424849 2572 scope.go:117] "RemoveContainer" containerID="30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f" Apr 20 15:11:26.425160 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:11:26.425141 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f\": container with ID starting with 30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f not found: ID does not exist" containerID="30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f" Apr 20 15:11:26.425225 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.425175 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f"} err="failed to get container status \"30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f\": rpc error: code = NotFound desc = could not find container \"30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f\": container with ID starting with 30897f9f9d87d976a3ce3908a237602cdf72f3a5b1204097830e574936048e0f not found: ID does not exist" Apr 20 15:11:26.460641 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.460598 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 15:11:26.460737 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:26.460681 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 15:11:27.018000 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:27.017964 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" event={"ID":"7159e679-12b5-4114-8f79-9e8ea4c77bdc","Type":"ContainerStarted","Data":"435fe574ae7e24028b2d7f1a95c18fbd30d57c8edd3a9acc1f8e2b4595be0989"} Apr 20 15:11:27.018373 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:27.018171 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:27.019890 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:27.019864 2572 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-fgkrd container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 15:11:27.020027 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:27.019916 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" podUID="7159e679-12b5-4114-8f79-9e8ea4c77bdc" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:11:27.039507 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:27.039435 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" podStartSLOduration=2.010394667 podStartE2EDuration="6.039399097s" podCreationTimestamp="2026-04-20 15:11:21 +0000 UTC" firstStartedPulling="2026-04-20 15:11:22.431345822 +0000 UTC m=+542.562221102" lastFinishedPulling="2026-04-20 15:11:26.460350248 +0000 UTC m=+546.591225532" observedRunningTime="2026-04-20 15:11:27.038451148 +0000 UTC m=+547.169326453" watchObservedRunningTime="2026-04-20 15:11:27.039399097 +0000 UTC m=+547.170274401" Apr 20 15:11:28.022296 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:28.022266 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fgkrd" Apr 20 15:11:37.351753 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.351718 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7"] Apr 20 15:11:37.352151 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.352031 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ab38a0df-e7d9-40fc-b543-abcd27a74532" containerName="registry-server" Apr 20 15:11:37.352151 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.352043 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab38a0df-e7d9-40fc-b543-abcd27a74532" containerName="registry-server" Apr 20 15:11:37.352151 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.352096 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab38a0df-e7d9-40fc-b543-abcd27a74532" containerName="registry-server" Apr 20 15:11:37.354096 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.354079 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:37.356599 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.356575 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 15:11:37.356713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.356693 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 15:11:37.357576 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.357560 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qnq8w\"" Apr 20 15:11:37.363027 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.363007 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7"] Apr 20 15:11:37.403070 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.403047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7\" (UID: 
\"54d323d1-538a-4238-8566-911a30915416\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:37.403168 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.403085 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7\" (UID: \"54d323d1-538a-4238-8566-911a30915416\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:37.403168 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.403113 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfz7x\" (UniqueName: \"kubernetes.io/projected/54d323d1-538a-4238-8566-911a30915416-kube-api-access-dfz7x\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7\" (UID: \"54d323d1-538a-4238-8566-911a30915416\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:37.504101 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.504074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7\" (UID: \"54d323d1-538a-4238-8566-911a30915416\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:37.504229 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.504114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7\" (UID: \"54d323d1-538a-4238-8566-911a30915416\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:37.504229 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.504139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfz7x\" (UniqueName: \"kubernetes.io/projected/54d323d1-538a-4238-8566-911a30915416-kube-api-access-dfz7x\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7\" (UID: \"54d323d1-538a-4238-8566-911a30915416\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:37.504458 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.504435 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7\" (UID: \"54d323d1-538a-4238-8566-911a30915416\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:37.504553 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.504468 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7\" (UID: \"54d323d1-538a-4238-8566-911a30915416\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:37.513044 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.513025 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfz7x\" (UniqueName: \"kubernetes.io/projected/54d323d1-538a-4238-8566-911a30915416-kube-api-access-dfz7x\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7\" (UID: \"54d323d1-538a-4238-8566-911a30915416\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 
15:11:37.663592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.663523 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:37.780250 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:37.780225 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7"] Apr 20 15:11:37.782417 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:11:37.782391 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d323d1_538a_4238_8566_911a30915416.slice/crio-407e9a27f1972bc5564ddb2822dffd9b2d8982bf1d1046d6220586706b4ed26a WatchSource:0}: Error finding container 407e9a27f1972bc5564ddb2822dffd9b2d8982bf1d1046d6220586706b4ed26a: Status 404 returned error can't find the container with id 407e9a27f1972bc5564ddb2822dffd9b2d8982bf1d1046d6220586706b4ed26a Apr 20 15:11:38.056074 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.056040 2572 generic.go:358] "Generic (PLEG): container finished" podID="54d323d1-538a-4238-8566-911a30915416" containerID="314937e14563ab1770c6ccd699831a292726a2b0b2afd1e95eafd6fb4bf34aff" exitCode=0 Apr 20 15:11:38.056223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.056094 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" event={"ID":"54d323d1-538a-4238-8566-911a30915416","Type":"ContainerDied","Data":"314937e14563ab1770c6ccd699831a292726a2b0b2afd1e95eafd6fb4bf34aff"} Apr 20 15:11:38.056223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.056126 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" 
event={"ID":"54d323d1-538a-4238-8566-911a30915416","Type":"ContainerStarted","Data":"407e9a27f1972bc5564ddb2822dffd9b2d8982bf1d1046d6220586706b4ed26a"} Apr 20 15:11:38.158578 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.158544 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg"] Apr 20 15:11:38.160721 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.160703 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:38.169474 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.169456 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg"] Apr 20 15:11:38.209786 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.209757 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:38.209919 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.209796 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:38.209919 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.209828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5fg5k\" (UniqueName: \"kubernetes.io/projected/b6c78ccb-cce3-4d64-95cc-f3454322aa30-kube-api-access-5fg5k\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:38.311096 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.311024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:38.311096 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.311064 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:38.311096 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.311093 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fg5k\" (UniqueName: \"kubernetes.io/projected/b6c78ccb-cce3-4d64-95cc-f3454322aa30-kube-api-access-5fg5k\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:38.311425 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.311407 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:38.311459 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.311426 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:38.319587 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.319563 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fg5k\" (UniqueName: \"kubernetes.io/projected/b6c78ccb-cce3-4d64-95cc-f3454322aa30-kube-api-access-5fg5k\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:38.470146 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.470123 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:38.588050 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.588024 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg"] Apr 20 15:11:38.590673 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:11:38.590645 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6c78ccb_cce3_4d64_95cc_f3454322aa30.slice/crio-bfd5023f608cb7c4259c2bccae99bc0f70976a2b7626304be11e4f4a350e823b WatchSource:0}: Error finding container bfd5023f608cb7c4259c2bccae99bc0f70976a2b7626304be11e4f4a350e823b: Status 404 returned error can't find the container with id bfd5023f608cb7c4259c2bccae99bc0f70976a2b7626304be11e4f4a350e823b Apr 20 15:11:38.951145 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.951122 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv"] Apr 20 15:11:38.953314 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.953298 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:38.961879 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:38.961857 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv"] Apr 20 15:11:39.017067 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.017041 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:39.017162 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.017076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:39.017202 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.017160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6jcf\" (UniqueName: \"kubernetes.io/projected/e9442516-f7c5-46be-bd33-94af1987a8f1-kube-api-access-j6jcf\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:39.065672 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.065645 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="b6c78ccb-cce3-4d64-95cc-f3454322aa30" containerID="09fd6602c16b43d30bd3e66f14d92480cc202537a565acfebfc7d5f25b24da1f" exitCode=0 Apr 20 15:11:39.065792 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.065723 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" event={"ID":"b6c78ccb-cce3-4d64-95cc-f3454322aa30","Type":"ContainerDied","Data":"09fd6602c16b43d30bd3e66f14d92480cc202537a565acfebfc7d5f25b24da1f"} Apr 20 15:11:39.065792 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.065750 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" event={"ID":"b6c78ccb-cce3-4d64-95cc-f3454322aa30","Type":"ContainerStarted","Data":"bfd5023f608cb7c4259c2bccae99bc0f70976a2b7626304be11e4f4a350e823b"} Apr 20 15:11:39.067403 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.067383 2572 generic.go:358] "Generic (PLEG): container finished" podID="54d323d1-538a-4238-8566-911a30915416" containerID="941fce8693998b69a0bf533be6a2b87c2e171025939226427f2aafa9429e9c91" exitCode=0 Apr 20 15:11:39.067538 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.067409 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" event={"ID":"54d323d1-538a-4238-8566-911a30915416","Type":"ContainerDied","Data":"941fce8693998b69a0bf533be6a2b87c2e171025939226427f2aafa9429e9c91"} Apr 20 15:11:39.118519 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.118480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6jcf\" (UniqueName: \"kubernetes.io/projected/e9442516-f7c5-46be-bd33-94af1987a8f1-kube-api-access-j6jcf\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:39.118629 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.118555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:39.118629 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.118611 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:39.118878 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.118858 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:39.118956 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.118939 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:39.127206 ip-10-0-134-230 kubenswrapper[2572]: I0420 
15:11:39.127189 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6jcf\" (UniqueName: \"kubernetes.io/projected/e9442516-f7c5-46be-bd33-94af1987a8f1-kube-api-access-j6jcf\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:39.295570 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.295545 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:39.356429 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.356397 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6"] Apr 20 15:11:39.359032 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.359013 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:39.366711 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.366686 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6"] Apr 20 15:11:39.414107 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.414082 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv"] Apr 20 15:11:39.416405 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:11:39.416382 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9442516_f7c5_46be_bd33_94af1987a8f1.slice/crio-8fef08ee1269bbe587c7f2e706fb8d6bd7beb0a9ada677bd54a5829cb4ceaeee WatchSource:0}: Error finding container 8fef08ee1269bbe587c7f2e706fb8d6bd7beb0a9ada677bd54a5829cb4ceaeee: Status 404 returned error can't find the container with id 8fef08ee1269bbe587c7f2e706fb8d6bd7beb0a9ada677bd54a5829cb4ceaeee Apr 20 15:11:39.421999 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.421976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtftq\" (UniqueName: \"kubernetes.io/projected/399c50ec-720e-47b6-badd-f2188a2d0035-kube-api-access-jtftq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6\" (UID: \"399c50ec-720e-47b6-badd-f2188a2d0035\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:39.422094 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.422040 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6\" (UID: 
\"399c50ec-720e-47b6-badd-f2188a2d0035\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:39.422094 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.422062 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6\" (UID: \"399c50ec-720e-47b6-badd-f2188a2d0035\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:39.522425 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.522399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6\" (UID: \"399c50ec-720e-47b6-badd-f2188a2d0035\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:39.522719 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.522437 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6\" (UID: \"399c50ec-720e-47b6-badd-f2188a2d0035\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:39.522719 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.522468 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtftq\" (UniqueName: \"kubernetes.io/projected/399c50ec-720e-47b6-badd-f2188a2d0035-kube-api-access-jtftq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6\" (UID: \"399c50ec-720e-47b6-badd-f2188a2d0035\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:39.522797 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.522729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6\" (UID: \"399c50ec-720e-47b6-badd-f2188a2d0035\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:39.522797 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.522778 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6\" (UID: \"399c50ec-720e-47b6-badd-f2188a2d0035\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:39.530518 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.530480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtftq\" (UniqueName: \"kubernetes.io/projected/399c50ec-720e-47b6-badd-f2188a2d0035-kube-api-access-jtftq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6\" (UID: \"399c50ec-720e-47b6-badd-f2188a2d0035\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:39.669093 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.669035 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:39.786883 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:39.786855 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6"] Apr 20 15:11:39.788787 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:11:39.788758 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod399c50ec_720e_47b6_badd_f2188a2d0035.slice/crio-c3154629329d2c8be33c3dfea8dc33e4c1f4db53d71ea85f5a993c5d6c144c33 WatchSource:0}: Error finding container c3154629329d2c8be33c3dfea8dc33e4c1f4db53d71ea85f5a993c5d6c144c33: Status 404 returned error can't find the container with id c3154629329d2c8be33c3dfea8dc33e4c1f4db53d71ea85f5a993c5d6c144c33 Apr 20 15:11:40.072619 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:40.072589 2572 generic.go:358] "Generic (PLEG): container finished" podID="54d323d1-538a-4238-8566-911a30915416" containerID="357b5afa409caca14478b61950924f31d5704964d61bdbe00e4a3bb196e35762" exitCode=0 Apr 20 15:11:40.072744 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:40.072664 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" event={"ID":"54d323d1-538a-4238-8566-911a30915416","Type":"ContainerDied","Data":"357b5afa409caca14478b61950924f31d5704964d61bdbe00e4a3bb196e35762"} Apr 20 15:11:40.074105 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:40.074081 2572 generic.go:358] "Generic (PLEG): container finished" podID="b6c78ccb-cce3-4d64-95cc-f3454322aa30" containerID="c8a76ac33ca6c4ad3e0666eec6bb1f886879749be723dd59cbaac614ead8fb52" exitCode=0 Apr 20 15:11:40.074200 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:40.074162 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" event={"ID":"b6c78ccb-cce3-4d64-95cc-f3454322aa30","Type":"ContainerDied","Data":"c8a76ac33ca6c4ad3e0666eec6bb1f886879749be723dd59cbaac614ead8fb52"} Apr 20 15:11:40.075468 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:40.075448 2572 generic.go:358] "Generic (PLEG): container finished" podID="399c50ec-720e-47b6-badd-f2188a2d0035" containerID="40839447d47918f3dfc3614e2dd05d5789fbe83323ed4b1d6f17abc86fbd278f" exitCode=0 Apr 20 15:11:40.075586 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:40.075525 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" event={"ID":"399c50ec-720e-47b6-badd-f2188a2d0035","Type":"ContainerDied","Data":"40839447d47918f3dfc3614e2dd05d5789fbe83323ed4b1d6f17abc86fbd278f"} Apr 20 15:11:40.075586 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:40.075556 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" event={"ID":"399c50ec-720e-47b6-badd-f2188a2d0035","Type":"ContainerStarted","Data":"c3154629329d2c8be33c3dfea8dc33e4c1f4db53d71ea85f5a993c5d6c144c33"} Apr 20 15:11:40.076974 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:40.076953 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9442516-f7c5-46be-bd33-94af1987a8f1" containerID="274d851d0e65fb8d3cad3febdce6e5ad612e540e8ed625941f6d784a5498564b" exitCode=0 Apr 20 15:11:40.077070 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:40.077033 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" event={"ID":"e9442516-f7c5-46be-bd33-94af1987a8f1","Type":"ContainerDied","Data":"274d851d0e65fb8d3cad3febdce6e5ad612e540e8ed625941f6d784a5498564b"} Apr 20 15:11:40.077070 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:40.077066 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" event={"ID":"e9442516-f7c5-46be-bd33-94af1987a8f1","Type":"ContainerStarted","Data":"8fef08ee1269bbe587c7f2e706fb8d6bd7beb0a9ada677bd54a5829cb4ceaeee"} Apr 20 15:11:41.082896 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.082869 2572 generic.go:358] "Generic (PLEG): container finished" podID="b6c78ccb-cce3-4d64-95cc-f3454322aa30" containerID="0cc493fb4fe1f318129d6a181fc91001a6e1e0b08d0be67f21e4dc7bec938a0d" exitCode=0 Apr 20 15:11:41.083261 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.082956 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" event={"ID":"b6c78ccb-cce3-4d64-95cc-f3454322aa30","Type":"ContainerDied","Data":"0cc493fb4fe1f318129d6a181fc91001a6e1e0b08d0be67f21e4dc7bec938a0d"} Apr 20 15:11:41.265031 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.265012 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:41.338636 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.338606 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfz7x\" (UniqueName: \"kubernetes.io/projected/54d323d1-538a-4238-8566-911a30915416-kube-api-access-dfz7x\") pod \"54d323d1-538a-4238-8566-911a30915416\" (UID: \"54d323d1-538a-4238-8566-911a30915416\") " Apr 20 15:11:41.338783 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.338681 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-bundle\") pod \"54d323d1-538a-4238-8566-911a30915416\" (UID: \"54d323d1-538a-4238-8566-911a30915416\") " Apr 20 15:11:41.338783 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.338730 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-util\") pod \"54d323d1-538a-4238-8566-911a30915416\" (UID: \"54d323d1-538a-4238-8566-911a30915416\") " Apr 20 15:11:41.339231 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.339199 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-bundle" (OuterVolumeSpecName: "bundle") pod "54d323d1-538a-4238-8566-911a30915416" (UID: "54d323d1-538a-4238-8566-911a30915416"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:11:41.340715 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.340686 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d323d1-538a-4238-8566-911a30915416-kube-api-access-dfz7x" (OuterVolumeSpecName: "kube-api-access-dfz7x") pod "54d323d1-538a-4238-8566-911a30915416" (UID: "54d323d1-538a-4238-8566-911a30915416"). InnerVolumeSpecName "kube-api-access-dfz7x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:11:41.344307 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.344282 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-util" (OuterVolumeSpecName: "util") pod "54d323d1-538a-4238-8566-911a30915416" (UID: "54d323d1-538a-4238-8566-911a30915416"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:11:41.439878 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.439843 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dfz7x\" (UniqueName: \"kubernetes.io/projected/54d323d1-538a-4238-8566-911a30915416-kube-api-access-dfz7x\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:41.439878 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.439873 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-bundle\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:41.439878 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:41.439883 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54d323d1-538a-4238-8566-911a30915416-util\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:42.088875 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.088835 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" event={"ID":"54d323d1-538a-4238-8566-911a30915416","Type":"ContainerDied","Data":"407e9a27f1972bc5564ddb2822dffd9b2d8982bf1d1046d6220586706b4ed26a"} Apr 20 15:11:42.089414 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.088882 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="407e9a27f1972bc5564ddb2822dffd9b2d8982bf1d1046d6220586706b4ed26a" Apr 20 15:11:42.089414 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.088895 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7" Apr 20 15:11:42.090745 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.090724 2572 generic.go:358] "Generic (PLEG): container finished" podID="399c50ec-720e-47b6-badd-f2188a2d0035" containerID="b8542b9ddcf56cddf771e1efda895673341c367a7866eb77ff2cbcd7feb29ee2" exitCode=0 Apr 20 15:11:42.090877 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.090780 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" event={"ID":"399c50ec-720e-47b6-badd-f2188a2d0035","Type":"ContainerDied","Data":"b8542b9ddcf56cddf771e1efda895673341c367a7866eb77ff2cbcd7feb29ee2"} Apr 20 15:11:42.092713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.092683 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9442516-f7c5-46be-bd33-94af1987a8f1" containerID="20c8d0e15620153ba37ac1c0c1091bd2eef7efc7f55a116dc5c33f30c7ac4865" exitCode=0 Apr 20 15:11:42.092713 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.092706 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" 
event={"ID":"e9442516-f7c5-46be-bd33-94af1987a8f1","Type":"ContainerDied","Data":"20c8d0e15620153ba37ac1c0c1091bd2eef7efc7f55a116dc5c33f30c7ac4865"} Apr 20 15:11:42.224329 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.224301 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:42.346503 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.346425 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-util\") pod \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " Apr 20 15:11:42.346503 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.346499 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fg5k\" (UniqueName: \"kubernetes.io/projected/b6c78ccb-cce3-4d64-95cc-f3454322aa30-kube-api-access-5fg5k\") pod \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " Apr 20 15:11:42.346706 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.346521 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-bundle\") pod \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\" (UID: \"b6c78ccb-cce3-4d64-95cc-f3454322aa30\") " Apr 20 15:11:42.347069 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.347045 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-bundle" (OuterVolumeSpecName: "bundle") pod "b6c78ccb-cce3-4d64-95cc-f3454322aa30" (UID: "b6c78ccb-cce3-4d64-95cc-f3454322aa30"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:11:42.348539 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.348520 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c78ccb-cce3-4d64-95cc-f3454322aa30-kube-api-access-5fg5k" (OuterVolumeSpecName: "kube-api-access-5fg5k") pod "b6c78ccb-cce3-4d64-95cc-f3454322aa30" (UID: "b6c78ccb-cce3-4d64-95cc-f3454322aa30"). InnerVolumeSpecName "kube-api-access-5fg5k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:11:42.350950 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.350908 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-util" (OuterVolumeSpecName: "util") pod "b6c78ccb-cce3-4d64-95cc-f3454322aa30" (UID: "b6c78ccb-cce3-4d64-95cc-f3454322aa30"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:11:42.447569 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.447543 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-util\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:42.447569 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.447569 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5fg5k\" (UniqueName: \"kubernetes.io/projected/b6c78ccb-cce3-4d64-95cc-f3454322aa30-kube-api-access-5fg5k\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:42.447683 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:42.447584 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6c78ccb-cce3-4d64-95cc-f3454322aa30-bundle\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:43.099031 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:43.098991 2572 generic.go:358] "Generic 
(PLEG): container finished" podID="399c50ec-720e-47b6-badd-f2188a2d0035" containerID="86f281c3a69a9e0c17c1fa505e339c231afa98709db488ffe6d8fb91cf827264" exitCode=0 Apr 20 15:11:43.099442 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:43.099071 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" event={"ID":"399c50ec-720e-47b6-badd-f2188a2d0035","Type":"ContainerDied","Data":"86f281c3a69a9e0c17c1fa505e339c231afa98709db488ffe6d8fb91cf827264"} Apr 20 15:11:43.100959 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:43.100936 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9442516-f7c5-46be-bd33-94af1987a8f1" containerID="3077502f5c85da65a506f8b6cd1ae7175b3263eb563d71d9cad8b4e8cf4562a5" exitCode=0 Apr 20 15:11:43.101076 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:43.101014 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" event={"ID":"e9442516-f7c5-46be-bd33-94af1987a8f1","Type":"ContainerDied","Data":"3077502f5c85da65a506f8b6cd1ae7175b3263eb563d71d9cad8b4e8cf4562a5"} Apr 20 15:11:43.102707 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:43.102686 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" event={"ID":"b6c78ccb-cce3-4d64-95cc-f3454322aa30","Type":"ContainerDied","Data":"bfd5023f608cb7c4259c2bccae99bc0f70976a2b7626304be11e4f4a350e823b"} Apr 20 15:11:43.102809 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:43.102712 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfd5023f608cb7c4259c2bccae99bc0f70976a2b7626304be11e4f4a350e823b" Apr 20 15:11:43.102809 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:43.102741 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg" Apr 20 15:11:44.236297 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.236272 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" Apr 20 15:11:44.264596 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.264576 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" Apr 20 15:11:44.362733 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.362671 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-bundle\") pod \"399c50ec-720e-47b6-badd-f2188a2d0035\" (UID: \"399c50ec-720e-47b6-badd-f2188a2d0035\") " Apr 20 15:11:44.362733 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.362704 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtftq\" (UniqueName: \"kubernetes.io/projected/399c50ec-720e-47b6-badd-f2188a2d0035-kube-api-access-jtftq\") pod \"399c50ec-720e-47b6-badd-f2188a2d0035\" (UID: \"399c50ec-720e-47b6-badd-f2188a2d0035\") " Apr 20 15:11:44.362733 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.362730 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-bundle\") pod \"e9442516-f7c5-46be-bd33-94af1987a8f1\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " Apr 20 15:11:44.362967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.362860 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-util\") pod 
\"e9442516-f7c5-46be-bd33-94af1987a8f1\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " Apr 20 15:11:44.362967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.362907 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6jcf\" (UniqueName: \"kubernetes.io/projected/e9442516-f7c5-46be-bd33-94af1987a8f1-kube-api-access-j6jcf\") pod \"e9442516-f7c5-46be-bd33-94af1987a8f1\" (UID: \"e9442516-f7c5-46be-bd33-94af1987a8f1\") " Apr 20 15:11:44.362967 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.362955 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-util\") pod \"399c50ec-720e-47b6-badd-f2188a2d0035\" (UID: \"399c50ec-720e-47b6-badd-f2188a2d0035\") " Apr 20 15:11:44.363343 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.363305 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-bundle" (OuterVolumeSpecName: "bundle") pod "399c50ec-720e-47b6-badd-f2188a2d0035" (UID: "399c50ec-720e-47b6-badd-f2188a2d0035"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:11:44.363442 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.363407 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-bundle" (OuterVolumeSpecName: "bundle") pod "e9442516-f7c5-46be-bd33-94af1987a8f1" (UID: "e9442516-f7c5-46be-bd33-94af1987a8f1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:11:44.364943 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.364915 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9442516-f7c5-46be-bd33-94af1987a8f1-kube-api-access-j6jcf" (OuterVolumeSpecName: "kube-api-access-j6jcf") pod "e9442516-f7c5-46be-bd33-94af1987a8f1" (UID: "e9442516-f7c5-46be-bd33-94af1987a8f1"). InnerVolumeSpecName "kube-api-access-j6jcf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:11:44.365201 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.365179 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399c50ec-720e-47b6-badd-f2188a2d0035-kube-api-access-jtftq" (OuterVolumeSpecName: "kube-api-access-jtftq") pod "399c50ec-720e-47b6-badd-f2188a2d0035" (UID: "399c50ec-720e-47b6-badd-f2188a2d0035"). InnerVolumeSpecName "kube-api-access-jtftq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:11:44.369255 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.369230 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-util" (OuterVolumeSpecName: "util") pod "e9442516-f7c5-46be-bd33-94af1987a8f1" (UID: "e9442516-f7c5-46be-bd33-94af1987a8f1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:11:44.371250 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.371217 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-util" (OuterVolumeSpecName: "util") pod "399c50ec-720e-47b6-badd-f2188a2d0035" (UID: "399c50ec-720e-47b6-badd-f2188a2d0035"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:11:44.463591 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.463572 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-util\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:44.463591 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.463591 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/399c50ec-720e-47b6-badd-f2188a2d0035-bundle\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:44.463720 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.463601 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jtftq\" (UniqueName: \"kubernetes.io/projected/399c50ec-720e-47b6-badd-f2188a2d0035-kube-api-access-jtftq\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:44.463720 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.463611 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-bundle\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:44.463720 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.463619 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9442516-f7c5-46be-bd33-94af1987a8f1-util\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:44.463720 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:44.463627 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j6jcf\" (UniqueName: \"kubernetes.io/projected/e9442516-f7c5-46be-bd33-94af1987a8f1-kube-api-access-j6jcf\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:11:45.110767 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:45.110742 2572 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6"
Apr 20 15:11:45.110936 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:45.110739 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6" event={"ID":"399c50ec-720e-47b6-badd-f2188a2d0035","Type":"ContainerDied","Data":"c3154629329d2c8be33c3dfea8dc33e4c1f4db53d71ea85f5a993c5d6c144c33"}
Apr 20 15:11:45.110936 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:45.110849 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3154629329d2c8be33c3dfea8dc33e4c1f4db53d71ea85f5a993c5d6c144c33"
Apr 20 15:11:45.112497 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:45.112461 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv"
Apr 20 15:11:45.112497 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:45.112466 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv" event={"ID":"e9442516-f7c5-46be-bd33-94af1987a8f1","Type":"ContainerDied","Data":"8fef08ee1269bbe587c7f2e706fb8d6bd7beb0a9ada677bd54a5829cb4ceaeee"}
Apr 20 15:11:45.112660 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:45.112507 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fef08ee1269bbe587c7f2e706fb8d6bd7beb0a9ada677bd54a5829cb4ceaeee"
Apr 20 15:11:59.344355 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344318 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"]
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344629 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9442516-f7c5-46be-bd33-94af1987a8f1" containerName="pull"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344643 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9442516-f7c5-46be-bd33-94af1987a8f1" containerName="pull"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344654 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54d323d1-538a-4238-8566-911a30915416" containerName="pull"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344661 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d323d1-538a-4238-8566-911a30915416" containerName="pull"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344669 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="399c50ec-720e-47b6-badd-f2188a2d0035" containerName="pull"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344678 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="399c50ec-720e-47b6-badd-f2188a2d0035" containerName="pull"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344686 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9442516-f7c5-46be-bd33-94af1987a8f1" containerName="extract"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344692 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9442516-f7c5-46be-bd33-94af1987a8f1" containerName="extract"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344700 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6c78ccb-cce3-4d64-95cc-f3454322aa30" containerName="extract"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344706 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c78ccb-cce3-4d64-95cc-f3454322aa30" containerName="extract"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344714 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6c78ccb-cce3-4d64-95cc-f3454322aa30" containerName="util"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344719 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c78ccb-cce3-4d64-95cc-f3454322aa30" containerName="util"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344725 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54d323d1-538a-4238-8566-911a30915416" containerName="extract"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344730 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d323d1-538a-4238-8566-911a30915416" containerName="extract"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344740 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54d323d1-538a-4238-8566-911a30915416" containerName="util"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344748 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d323d1-538a-4238-8566-911a30915416" containerName="util"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344760 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9442516-f7c5-46be-bd33-94af1987a8f1" containerName="util"
Apr 20 15:11:59.344757 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344766 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9442516-f7c5-46be-bd33-94af1987a8f1" containerName="util"
Apr 20 15:11:59.345308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344775 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6c78ccb-cce3-4d64-95cc-f3454322aa30" containerName="pull"
Apr 20 15:11:59.345308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344779 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c78ccb-cce3-4d64-95cc-f3454322aa30" containerName="pull"
Apr 20 15:11:59.345308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344787 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="399c50ec-720e-47b6-badd-f2188a2d0035" containerName="extract"
Apr 20 15:11:59.345308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344791 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="399c50ec-720e-47b6-badd-f2188a2d0035" containerName="extract"
Apr 20 15:11:59.345308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344801 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="399c50ec-720e-47b6-badd-f2188a2d0035" containerName="util"
Apr 20 15:11:59.345308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344806 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="399c50ec-720e-47b6-badd-f2188a2d0035" containerName="util"
Apr 20 15:11:59.345308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344857 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="54d323d1-538a-4238-8566-911a30915416" containerName="extract"
Apr 20 15:11:59.345308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344866 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9442516-f7c5-46be-bd33-94af1987a8f1" containerName="extract"
Apr 20 15:11:59.345308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344873 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6c78ccb-cce3-4d64-95cc-f3454322aa30" containerName="extract"
Apr 20 15:11:59.345308 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.344881 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="399c50ec-720e-47b6-badd-f2188a2d0035" containerName="extract"
Apr 20 15:11:59.391559 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.391529 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"]
Apr 20 15:11:59.391715 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.391638 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:11:59.396222 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.396202 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 15:11:59.396445 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.396431 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-8b6tn\""
Apr 20 15:11:59.396985 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.396971 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 15:11:59.477949 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.477919 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6vzs\" (UniqueName: \"kubernetes.io/projected/fa2037f6-93c4-4c2c-af5c-fb40927efece-kube-api-access-t6vzs\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" (UID: \"fa2037f6-93c4-4c2c-af5c-fb40927efece\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:11:59.478170 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.478150 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fa2037f6-93c4-4c2c-af5c-fb40927efece-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" (UID: \"fa2037f6-93c4-4c2c-af5c-fb40927efece\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:11:59.579184 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.579155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fa2037f6-93c4-4c2c-af5c-fb40927efece-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" (UID: \"fa2037f6-93c4-4c2c-af5c-fb40927efece\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:11:59.579313 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.579192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6vzs\" (UniqueName: \"kubernetes.io/projected/fa2037f6-93c4-4c2c-af5c-fb40927efece-kube-api-access-t6vzs\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" (UID: \"fa2037f6-93c4-4c2c-af5c-fb40927efece\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:11:59.579547 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.579517 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fa2037f6-93c4-4c2c-af5c-fb40927efece-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" (UID: \"fa2037f6-93c4-4c2c-af5c-fb40927efece\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:11:59.604213 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.604152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6vzs\" (UniqueName: \"kubernetes.io/projected/fa2037f6-93c4-4c2c-af5c-fb40927efece-kube-api-access-t6vzs\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" (UID: \"fa2037f6-93c4-4c2c-af5c-fb40927efece\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:11:59.701529 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.701495 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:11:59.836759 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:11:59.836734 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"]
Apr 20 15:11:59.839371 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:11:59.839342 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa2037f6_93c4_4c2c_af5c_fb40927efece.slice/crio-8dcccd961f507bc70409f94cc745179e3906ea3955948db5c756a241a10b5682 WatchSource:0}: Error finding container 8dcccd961f507bc70409f94cc745179e3906ea3955948db5c756a241a10b5682: Status 404 returned error can't find the container with id 8dcccd961f507bc70409f94cc745179e3906ea3955948db5c756a241a10b5682
Apr 20 15:12:00.166560 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:00.166528 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" event={"ID":"fa2037f6-93c4-4c2c-af5c-fb40927efece","Type":"ContainerStarted","Data":"8dcccd961f507bc70409f94cc745179e3906ea3955948db5c756a241a10b5682"}
Apr 20 15:12:06.189737 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:06.189695 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" event={"ID":"fa2037f6-93c4-4c2c-af5c-fb40927efece","Type":"ContainerStarted","Data":"ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f"}
Apr 20 15:12:06.190103 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:06.189790 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:12:06.209134 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:06.209088 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" podStartSLOduration=1.324002631 podStartE2EDuration="7.209074249s" podCreationTimestamp="2026-04-20 15:11:59 +0000 UTC" firstStartedPulling="2026-04-20 15:11:59.841585046 +0000 UTC m=+579.972460327" lastFinishedPulling="2026-04-20 15:12:05.726656661 +0000 UTC m=+585.857531945" observedRunningTime="2026-04-20 15:12:06.207156096 +0000 UTC m=+586.338031399" watchObservedRunningTime="2026-04-20 15:12:06.209074249 +0000 UTC m=+586.339949553"
Apr 20 15:12:17.195824 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:17.195793 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:12:18.706121 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.706088 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"]
Apr 20 15:12:18.708868 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.708852 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"
Apr 20 15:12:18.724620 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.724593 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"]
Apr 20 15:12:18.738744 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.738715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/db5fdd1e-bd48-4e12-8780-bfd6262659bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" (UID: \"db5fdd1e-bd48-4e12-8780-bfd6262659bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"
Apr 20 15:12:18.739179 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.739120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8f4k\" (UniqueName: \"kubernetes.io/projected/db5fdd1e-bd48-4e12-8780-bfd6262659bf-kube-api-access-p8f4k\") pod \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" (UID: \"db5fdd1e-bd48-4e12-8780-bfd6262659bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"
Apr 20 15:12:18.815051 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.815012 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"]
Apr 20 15:12:18.815257 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:12:18.815230 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-p8f4k], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x" podUID="db5fdd1e-bd48-4e12-8780-bfd6262659bf"
Apr 20 15:12:18.840143 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.840113 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/db5fdd1e-bd48-4e12-8780-bfd6262659bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" (UID: \"db5fdd1e-bd48-4e12-8780-bfd6262659bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"
Apr 20 15:12:18.840232 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.840172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8f4k\" (UniqueName: \"kubernetes.io/projected/db5fdd1e-bd48-4e12-8780-bfd6262659bf-kube-api-access-p8f4k\") pod \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" (UID: \"db5fdd1e-bd48-4e12-8780-bfd6262659bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"
Apr 20 15:12:18.840600 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.840579 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/db5fdd1e-bd48-4e12-8780-bfd6262659bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" (UID: \"db5fdd1e-bd48-4e12-8780-bfd6262659bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"
Apr 20 15:12:18.864590 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.864556 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8f4k\" (UniqueName: \"kubernetes.io/projected/db5fdd1e-bd48-4e12-8780-bfd6262659bf-kube-api-access-p8f4k\") pod \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" (UID: \"db5fdd1e-bd48-4e12-8780-bfd6262659bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"
Apr 20 15:12:18.877953 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.877922 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"]
Apr 20 15:12:18.878158 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.878129 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" containerName="manager" containerID="cri-o://ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f" gracePeriod=2
Apr 20 15:12:18.889497 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.889453 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"]
Apr 20 15:12:18.921231 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.921195 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"]
Apr 20 15:12:18.948639 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.948606 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"]
Apr 20 15:12:18.948907 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.948893 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" containerName="manager"
Apr 20 15:12:18.948907 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.948908 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" containerName="manager"
Apr 20 15:12:18.949025 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.948971 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" containerName="manager"
Apr 20 15:12:18.950576 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.950557 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"]
Apr 20 15:12:18.950670 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.950658 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"
Apr 20 15:12:18.953526 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.953495 2572 status_manager.go:895] "Failed to get status for pod" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object"
Apr 20 15:12:18.995002 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:18.994955 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"]
Apr 20 15:12:19.041922 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.041891 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d69536c5-81c8-494e-a12a-97cb14f80690-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r6vwv\" (UID: \"d69536c5-81c8-494e-a12a-97cb14f80690\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"
Apr 20 15:12:19.042062 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.041936 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpcxw\" (UniqueName: \"kubernetes.io/projected/d69536c5-81c8-494e-a12a-97cb14f80690-kube-api-access-lpcxw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r6vwv\" (UID: \"d69536c5-81c8-494e-a12a-97cb14f80690\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"
Apr 20 15:12:19.106866 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.106844 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:12:19.110724 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.110698 2572 status_manager.go:895] "Failed to get status for pod" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object"
Apr 20 15:12:19.142259 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.142237 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fa2037f6-93c4-4c2c-af5c-fb40927efece-extensions-socket-volume\") pod \"fa2037f6-93c4-4c2c-af5c-fb40927efece\" (UID: \"fa2037f6-93c4-4c2c-af5c-fb40927efece\") "
Apr 20 15:12:19.142368 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.142291 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6vzs\" (UniqueName: \"kubernetes.io/projected/fa2037f6-93c4-4c2c-af5c-fb40927efece-kube-api-access-t6vzs\") pod \"fa2037f6-93c4-4c2c-af5c-fb40927efece\" (UID: \"fa2037f6-93c4-4c2c-af5c-fb40927efece\") "
Apr 20 15:12:19.142434 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.142417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d69536c5-81c8-494e-a12a-97cb14f80690-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r6vwv\" (UID: \"d69536c5-81c8-494e-a12a-97cb14f80690\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"
Apr 20 15:12:19.142527 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.142477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpcxw\" (UniqueName: \"kubernetes.io/projected/d69536c5-81c8-494e-a12a-97cb14f80690-kube-api-access-lpcxw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r6vwv\" (UID: \"d69536c5-81c8-494e-a12a-97cb14f80690\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"
Apr 20 15:12:19.142843 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.142817 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2037f6-93c4-4c2c-af5c-fb40927efece-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "fa2037f6-93c4-4c2c-af5c-fb40927efece" (UID: "fa2037f6-93c4-4c2c-af5c-fb40927efece"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:12:19.142923 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.142838 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d69536c5-81c8-494e-a12a-97cb14f80690-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r6vwv\" (UID: \"d69536c5-81c8-494e-a12a-97cb14f80690\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"
Apr 20 15:12:19.144387 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.144367 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2037f6-93c4-4c2c-af5c-fb40927efece-kube-api-access-t6vzs" (OuterVolumeSpecName: "kube-api-access-t6vzs") pod "fa2037f6-93c4-4c2c-af5c-fb40927efece" (UID: "fa2037f6-93c4-4c2c-af5c-fb40927efece"). InnerVolumeSpecName "kube-api-access-t6vzs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:12:19.150702 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.150678 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpcxw\" (UniqueName: \"kubernetes.io/projected/d69536c5-81c8-494e-a12a-97cb14f80690-kube-api-access-lpcxw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r6vwv\" (UID: \"d69536c5-81c8-494e-a12a-97cb14f80690\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"
Apr 20 15:12:19.236243 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.236156 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa2037f6-93c4-4c2c-af5c-fb40927efece" containerID="ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f" exitCode=0
Apr 20 15:12:19.236243 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.236207 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p"
Apr 20 15:12:19.236456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.236255 2572 scope.go:117] "RemoveContainer" containerID="ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f"
Apr 20 15:12:19.236456 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.236395 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"
Apr 20 15:12:19.238572 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.238543 2572 status_manager.go:895] "Failed to get status for pod" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object"
Apr 20 15:12:19.240714 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.240352 2572 status_manager.go:895] "Failed to get status for pod" podUID="db5fdd1e-bd48-4e12-8780-bfd6262659bf" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x" err="pods \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object"
Apr 20 15:12:19.242221 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.242181 2572 status_manager.go:895] "Failed to get status for pod" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object"
Apr 20 15:12:19.245539 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.244217 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fa2037f6-93c4-4c2c-af5c-fb40927efece-extensions-socket-volume\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\""
Apr 20 15:12:19.245539 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.244247 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t6vzs\" (UniqueName: \"kubernetes.io/projected/fa2037f6-93c4-4c2c-af5c-fb40927efece-kube-api-access-t6vzs\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\""
Apr 20 15:12:19.246430 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.246411 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"
Apr 20 15:12:19.248251 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.248227 2572 status_manager.go:895] "Failed to get status for pod" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object"
Apr 20 15:12:19.249903 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.249887 2572 scope.go:117] "RemoveContainer" containerID="ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f"
Apr 20 15:12:19.250020 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.249991 2572 status_manager.go:895] "Failed to get status for pod" podUID="db5fdd1e-bd48-4e12-8780-bfd6262659bf" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x" err="pods \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object"
Apr 20 15:12:19.250154 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:12:19.250135 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f\": container with ID starting with ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f not found: ID does not exist" containerID="ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f"
Apr 20 15:12:19.250199 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.250161 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f"} err="failed to get container status \"ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f\": rpc error: code = NotFound desc = could not find container \"ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f\": container with ID starting with ba245e4c8ff9907a7407afbae816abfc3bb2a66ce74329d085b581f2da65a39f not found: ID does not exist"
Apr 20 15:12:19.251743 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.251725 2572 status_manager.go:895] "Failed to get status for pod" podUID="db5fdd1e-bd48-4e12-8780-bfd6262659bf" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x" err="pods \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object"
Apr 20 15:12:19.253399 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.253382 2572 status_manager.go:895] "Failed to get status for pod" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object"
Apr 20 15:12:19.262669 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.262650 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"
Apr 20 15:12:19.345002 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.344978 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8f4k\" (UniqueName: \"kubernetes.io/projected/db5fdd1e-bd48-4e12-8780-bfd6262659bf-kube-api-access-p8f4k\") pod \"db5fdd1e-bd48-4e12-8780-bfd6262659bf\" (UID: \"db5fdd1e-bd48-4e12-8780-bfd6262659bf\") "
Apr 20 15:12:19.345133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.345042 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/db5fdd1e-bd48-4e12-8780-bfd6262659bf-extensions-socket-volume\") pod \"db5fdd1e-bd48-4e12-8780-bfd6262659bf\" (UID: \"db5fdd1e-bd48-4e12-8780-bfd6262659bf\") "
Apr 20 15:12:19.345279 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.345260 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db5fdd1e-bd48-4e12-8780-bfd6262659bf-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "db5fdd1e-bd48-4e12-8780-bfd6262659bf" (UID: "db5fdd1e-bd48-4e12-8780-bfd6262659bf"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:12:19.348636 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.348611 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db5fdd1e-bd48-4e12-8780-bfd6262659bf-kube-api-access-p8f4k" (OuterVolumeSpecName: "kube-api-access-p8f4k") pod "db5fdd1e-bd48-4e12-8780-bfd6262659bf" (UID: "db5fdd1e-bd48-4e12-8780-bfd6262659bf"). InnerVolumeSpecName "kube-api-access-p8f4k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:12:19.386905 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.386878 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"]
Apr 20 15:12:19.389100 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:12:19.389073 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd69536c5_81c8_494e_a12a_97cb14f80690.slice/crio-cd2f564175abdf36fd30f8f32f95e1c28189a96693031467de7ffe7108e07345 WatchSource:0}: Error finding container cd2f564175abdf36fd30f8f32f95e1c28189a96693031467de7ffe7108e07345: Status 404 returned error can't find the container with id cd2f564175abdf36fd30f8f32f95e1c28189a96693031467de7ffe7108e07345
Apr 20 15:12:19.446031 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.446003 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p8f4k\" (UniqueName: \"kubernetes.io/projected/db5fdd1e-bd48-4e12-8780-bfd6262659bf-kube-api-access-p8f4k\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\""
Apr 20 15:12:19.446031 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:19.446031 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/db5fdd1e-bd48-4e12-8780-bfd6262659bf-extensions-socket-volume\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\""
Apr 20 15:12:20.244211 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.244123 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x"
Apr 20 15:12:20.244211 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.244136 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv" event={"ID":"d69536c5-81c8-494e-a12a-97cb14f80690","Type":"ContainerStarted","Data":"51e191e6c665c7a8b17bc14759915ffc6dd9093b2e4c802326ef97d3d1824b77"}
Apr 20 15:12:20.244211 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.244167 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv" event={"ID":"d69536c5-81c8-494e-a12a-97cb14f80690","Type":"ContainerStarted","Data":"cd2f564175abdf36fd30f8f32f95e1c28189a96693031467de7ffe7108e07345"}
Apr 20 15:12:20.244730 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.244292 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"
Apr 20 15:12:20.246117 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.246092 2572 status_manager.go:895] "Failed to get status for pod" podUID="db5fdd1e-bd48-4e12-8780-bfd6262659bf" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x" err="pods \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object"
Apr 20 15:12:20.247942 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.247909 2572 status_manager.go:895] "Failed to get status for pod" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" is forbidden: User
\"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object" Apr 20 15:12:20.272989 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.272958 2572 status_manager.go:895] "Failed to get status for pod" podUID="db5fdd1e-bd48-4e12-8780-bfd6262659bf" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x" err="pods \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object" Apr 20 15:12:20.273637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.273602 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv" podStartSLOduration=2.273591667 podStartE2EDuration="2.273591667s" podCreationTimestamp="2026-04-20 15:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:12:20.270827807 +0000 UTC m=+600.401703108" watchObservedRunningTime="2026-04-20 15:12:20.273591667 +0000 UTC m=+600.404466969" Apr 20 15:12:20.274784 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.274751 2572 status_manager.go:895] "Failed to get status for pod" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object" Apr 20 15:12:20.410122 ip-10-0-134-230 
kubenswrapper[2572]: I0420 15:12:20.410091 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db5fdd1e-bd48-4e12-8780-bfd6262659bf" path="/var/lib/kubelet/pods/db5fdd1e-bd48-4e12-8780-bfd6262659bf/volumes" Apr 20 15:12:20.410266 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.410090 2572 status_manager.go:895] "Failed to get status for pod" podUID="db5fdd1e-bd48-4e12-8780-bfd6262659bf" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-xgw9x" err="pods \"kuadrant-operator-controller-manager-84b657d985-xgw9x\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object" Apr 20 15:12:20.410366 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.410354 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" path="/var/lib/kubelet/pods/fa2037f6-93c4-4c2c-af5c-fb40927efece/volumes" Apr 20 15:12:20.412005 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:20.411979 2572 status_manager.go:895] "Failed to get status for pod" podUID="fa2037f6-93c4-4c2c-af5c-fb40927efece" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pcx6p" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-pcx6p\" is forbidden: User \"system:node:ip-10-0-134-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-230.ec2.internal' and this object" Apr 20 15:12:31.249127 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:31.249096 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv" Apr 20 15:12:35.170387 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:35.170351 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"] Apr 20 15:12:35.170851 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:35.170586 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv" podUID="d69536c5-81c8-494e-a12a-97cb14f80690" containerName="manager" containerID="cri-o://51e191e6c665c7a8b17bc14759915ffc6dd9093b2e4c802326ef97d3d1824b77" gracePeriod=10 Apr 20 15:12:35.295140 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:35.295107 2572 generic.go:358] "Generic (PLEG): container finished" podID="d69536c5-81c8-494e-a12a-97cb14f80690" containerID="51e191e6c665c7a8b17bc14759915ffc6dd9093b2e4c802326ef97d3d1824b77" exitCode=0 Apr 20 15:12:35.295295 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:35.295178 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv" event={"ID":"d69536c5-81c8-494e-a12a-97cb14f80690","Type":"ContainerDied","Data":"51e191e6c665c7a8b17bc14759915ffc6dd9093b2e4c802326ef97d3d1824b77"} Apr 20 15:12:35.403113 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:35.403090 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv" Apr 20 15:12:35.467413 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:35.467333 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d69536c5-81c8-494e-a12a-97cb14f80690-extensions-socket-volume\") pod \"d69536c5-81c8-494e-a12a-97cb14f80690\" (UID: \"d69536c5-81c8-494e-a12a-97cb14f80690\") " Apr 20 15:12:35.467413 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:35.467384 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpcxw\" (UniqueName: \"kubernetes.io/projected/d69536c5-81c8-494e-a12a-97cb14f80690-kube-api-access-lpcxw\") pod \"d69536c5-81c8-494e-a12a-97cb14f80690\" (UID: \"d69536c5-81c8-494e-a12a-97cb14f80690\") " Apr 20 15:12:35.467745 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:35.467719 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d69536c5-81c8-494e-a12a-97cb14f80690-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "d69536c5-81c8-494e-a12a-97cb14f80690" (UID: "d69536c5-81c8-494e-a12a-97cb14f80690"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:12:35.469359 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:35.469339 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69536c5-81c8-494e-a12a-97cb14f80690-kube-api-access-lpcxw" (OuterVolumeSpecName: "kube-api-access-lpcxw") pod "d69536c5-81c8-494e-a12a-97cb14f80690" (UID: "d69536c5-81c8-494e-a12a-97cb14f80690"). InnerVolumeSpecName "kube-api-access-lpcxw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:12:35.568708 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:35.568675 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d69536c5-81c8-494e-a12a-97cb14f80690-extensions-socket-volume\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:12:35.568708 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:35.568702 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lpcxw\" (UniqueName: \"kubernetes.io/projected/d69536c5-81c8-494e-a12a-97cb14f80690-kube-api-access-lpcxw\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:12:36.299868 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:36.299834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv" event={"ID":"d69536c5-81c8-494e-a12a-97cb14f80690","Type":"ContainerDied","Data":"cd2f564175abdf36fd30f8f32f95e1c28189a96693031467de7ffe7108e07345"} Apr 20 15:12:36.299868 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:36.299871 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv" Apr 20 15:12:36.300359 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:36.299874 2572 scope.go:117] "RemoveContainer" containerID="51e191e6c665c7a8b17bc14759915ffc6dd9093b2e4c802326ef97d3d1824b77" Apr 20 15:12:36.321016 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:36.320987 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"] Apr 20 15:12:36.325533 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:36.325511 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r6vwv"] Apr 20 15:12:36.409205 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:36.409162 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d69536c5-81c8-494e-a12a-97cb14f80690" path="/var/lib/kubelet/pods/d69536c5-81c8-494e-a12a-97cb14f80690/volumes" Apr 20 15:12:51.415137 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.415050 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp"] Apr 20 15:12:51.415723 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.415508 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d69536c5-81c8-494e-a12a-97cb14f80690" containerName="manager" Apr 20 15:12:51.415723 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.415526 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69536c5-81c8-494e-a12a-97cb14f80690" containerName="manager" Apr 20 15:12:51.415723 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.415604 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d69536c5-81c8-494e-a12a-97cb14f80690" containerName="manager" Apr 20 15:12:51.418590 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.418566 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.421151 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.421129 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-t5ns6\"" Apr 20 15:12:51.430095 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.430074 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp"] Apr 20 15:12:51.482988 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.482953 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.482988 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.482985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6f8428f5-731d-47f5-a2a3-6e064ad22824-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.483201 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.483005 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.483201 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.483021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.483201 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.483068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.483201 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.483137 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.483354 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.483255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.483354 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.483290 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8v4h\" (UniqueName: \"kubernetes.io/projected/6f8428f5-731d-47f5-a2a3-6e064ad22824-kube-api-access-d8v4h\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.483354 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.483324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584001 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.583968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584001 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.584004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8v4h\" (UniqueName: \"kubernetes.io/projected/6f8428f5-731d-47f5-a2a3-6e064ad22824-kube-api-access-d8v4h\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584239 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.584027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584239 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.584167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584239 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.584217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6f8428f5-731d-47f5-a2a3-6e064ad22824-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584400 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.584250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584400 ip-10-0-134-230 kubenswrapper[2572]: I0420 
15:12:51.584286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584400 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.584393 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584572 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.584437 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584572 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.584469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584572 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.584541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584726 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.584597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.584779 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.584766 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.585928 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.585866 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6f8428f5-731d-47f5-a2a3-6e064ad22824-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.587120 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.587059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: 
\"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.587287 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.587263 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.600690 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.600660 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8v4h\" (UniqueName: \"kubernetes.io/projected/6f8428f5-731d-47f5-a2a3-6e064ad22824-kube-api-access-d8v4h\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.600917 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.600898 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6f8428f5-731d-47f5-a2a3-6e064ad22824-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-q76sp\" (UID: \"6f8428f5-731d-47f5-a2a3-6e064ad22824\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.733274 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.733188 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" Apr 20 15:12:51.866689 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.866654 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp"] Apr 20 15:12:51.867478 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:12:51.867447 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f8428f5_731d_47f5_a2a3_6e064ad22824.slice/crio-94cf1aa4c9efb83506e5e23f45724791014e08fb5f8f3512a7f2beb80fdd9b22 WatchSource:0}: Error finding container 94cf1aa4c9efb83506e5e23f45724791014e08fb5f8f3512a7f2beb80fdd9b22: Status 404 returned error can't find the container with id 94cf1aa4c9efb83506e5e23f45724791014e08fb5f8f3512a7f2beb80fdd9b22 Apr 20 15:12:51.869210 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:51.869193 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:12:52.359436 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:52.359400 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" event={"ID":"6f8428f5-731d-47f5-a2a3-6e064ad22824","Type":"ContainerStarted","Data":"94cf1aa4c9efb83506e5e23f45724791014e08fb5f8f3512a7f2beb80fdd9b22"} Apr 20 15:12:54.244976 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:54.244936 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 15:12:54.245228 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:54.245008 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 15:12:54.245228 ip-10-0-134-230 
kubenswrapper[2572]: I0420 15:12:54.245044 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"}
Apr 20 15:12:54.368998 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:54.368958 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" event={"ID":"6f8428f5-731d-47f5-a2a3-6e064ad22824","Type":"ContainerStarted","Data":"1c1a435e9ff43a9f3faa636330f7c7b86de8bf7e45f48fa559877292abd4a6c0"}
Apr 20 15:12:54.391347 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:54.391282 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp" podStartSLOduration=1.015899479 podStartE2EDuration="3.391263532s" podCreationTimestamp="2026-04-20 15:12:51 +0000 UTC" firstStartedPulling="2026-04-20 15:12:51.869318143 +0000 UTC m=+632.000193424" lastFinishedPulling="2026-04-20 15:12:54.244682196 +0000 UTC m=+634.375557477" observedRunningTime="2026-04-20 15:12:54.38789801 +0000 UTC m=+634.518773315" watchObservedRunningTime="2026-04-20 15:12:54.391263532 +0000 UTC m=+634.522138834"
Apr 20 15:12:54.734028 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:54.733990 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp"
Apr 20 15:12:55.563386 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.563355 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-94j7w"]
Apr 20 15:12:55.566714 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.566693 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:12:55.569314 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.569294 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 15:12:55.570304 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.570282 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 20 15:12:55.570410 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.570304 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 15:12:55.570410 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.570282 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qnq8w\""
Apr 20 15:12:55.578272 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.578250 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-94j7w"]
Apr 20 15:12:55.620425 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.620399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5qvl\" (UniqueName: \"kubernetes.io/projected/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-kube-api-access-d5qvl\") pod \"limitador-limitador-7d549b5b-94j7w\" (UID: \"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13\") " pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:12:55.620543 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.620459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-config-file\") pod \"limitador-limitador-7d549b5b-94j7w\" (UID: \"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13\") " pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:12:55.665629 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.665592 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-94j7w"]
Apr 20 15:12:55.721064 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.721029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5qvl\" (UniqueName: \"kubernetes.io/projected/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-kube-api-access-d5qvl\") pod \"limitador-limitador-7d549b5b-94j7w\" (UID: \"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13\") " pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:12:55.721222 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.721098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-config-file\") pod \"limitador-limitador-7d549b5b-94j7w\" (UID: \"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13\") " pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:12:55.721648 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.721630 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-config-file\") pod \"limitador-limitador-7d549b5b-94j7w\" (UID: \"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13\") " pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:12:55.729129 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.729096 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5qvl\" (UniqueName: \"kubernetes.io/projected/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-kube-api-access-d5qvl\") pod \"limitador-limitador-7d549b5b-94j7w\" (UID: \"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13\") " pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:12:55.738692 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.738666 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp"
Apr 20 15:12:55.856138 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.856055 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:12:55.859570 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.859537 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp"
Apr 20 15:12:55.866309 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.866285 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:12:55.876370 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.876349 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:12:55.892685 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.892658 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:12:55.923551 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.923479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pngh5\" (UniqueName: \"kubernetes.io/projected/854cce36-de61-4a85-8526-67aa5a1fa03c-kube-api-access-pngh5\") pod \"limitador-limitador-78c99df468-n6kbp\" (UID: \"854cce36-de61-4a85-8526-67aa5a1fa03c\") " pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp"
Apr 20 15:12:55.923637 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.923578 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/854cce36-de61-4a85-8526-67aa5a1fa03c-config-file\") pod \"limitador-limitador-78c99df468-n6kbp\" (UID: \"854cce36-de61-4a85-8526-67aa5a1fa03c\") " pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp"
Apr 20 15:12:55.996317 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:55.996294 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-94j7w"]
Apr 20 15:12:55.997983 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:12:55.997956 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a85951_ba0b_48b9_9a1b_2a9a8848ad13.slice/crio-fb8803589a09bd566fba3700a8a4d1a7e462f3d42e91c36c62cde1f32ee2b74c WatchSource:0}: Error finding container fb8803589a09bd566fba3700a8a4d1a7e462f3d42e91c36c62cde1f32ee2b74c: Status 404 returned error can't find the container with id fb8803589a09bd566fba3700a8a4d1a7e462f3d42e91c36c62cde1f32ee2b74c
Apr 20 15:12:56.024949 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:56.024923 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pngh5\" (UniqueName: \"kubernetes.io/projected/854cce36-de61-4a85-8526-67aa5a1fa03c-kube-api-access-pngh5\") pod \"limitador-limitador-78c99df468-n6kbp\" (UID: \"854cce36-de61-4a85-8526-67aa5a1fa03c\") " pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp"
Apr 20 15:12:56.025044 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:56.024968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/854cce36-de61-4a85-8526-67aa5a1fa03c-config-file\") pod \"limitador-limitador-78c99df468-n6kbp\" (UID: \"854cce36-de61-4a85-8526-67aa5a1fa03c\") " pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp"
Apr 20 15:12:56.025842 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:56.025824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/854cce36-de61-4a85-8526-67aa5a1fa03c-config-file\") pod \"limitador-limitador-78c99df468-n6kbp\" (UID: \"854cce36-de61-4a85-8526-67aa5a1fa03c\") " pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp"
Apr 20 15:12:56.032251 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:56.032233 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pngh5\" (UniqueName: \"kubernetes.io/projected/854cce36-de61-4a85-8526-67aa5a1fa03c-kube-api-access-pngh5\") pod \"limitador-limitador-78c99df468-n6kbp\" (UID: \"854cce36-de61-4a85-8526-67aa5a1fa03c\") " pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp"
Apr 20 15:12:56.170409 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:56.170335 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp"
Apr 20 15:12:56.285945 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:56.285916 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:12:56.287066 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:12:56.287040 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod854cce36_de61_4a85_8526_67aa5a1fa03c.slice/crio-5e5c931c45d6d28837c3bbb2c8133b9337af2984157edcc622428684fd88e1ed WatchSource:0}: Error finding container 5e5c931c45d6d28837c3bbb2c8133b9337af2984157edcc622428684fd88e1ed: Status 404 returned error can't find the container with id 5e5c931c45d6d28837c3bbb2c8133b9337af2984157edcc622428684fd88e1ed
Apr 20 15:12:56.378934 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:56.378897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w" event={"ID":"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13","Type":"ContainerStarted","Data":"fb8803589a09bd566fba3700a8a4d1a7e462f3d42e91c36c62cde1f32ee2b74c"}
Apr 20 15:12:56.380139 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:56.380112 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp" event={"ID":"854cce36-de61-4a85-8526-67aa5a1fa03c","Type":"ContainerStarted","Data":"5e5c931c45d6d28837c3bbb2c8133b9337af2984157edcc622428684fd88e1ed"}
Apr 20 15:12:56.380410 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:56.380386 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp"
Apr 20 15:12:56.381395 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:56.381376 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-q76sp"
Apr 20 15:12:59.394381 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:59.394281 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w" event={"ID":"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13","Type":"ContainerStarted","Data":"c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41"}
Apr 20 15:12:59.394381 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:59.394345 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:12:59.395710 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:59.395686 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp" event={"ID":"854cce36-de61-4a85-8526-67aa5a1fa03c","Type":"ContainerStarted","Data":"09980c23cc725040a0c355a0ab91b57936548b197cb6a083fe4508392be09685"}
Apr 20 15:12:59.395826 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:59.395805 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp"
Apr 20 15:12:59.414164 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:59.414121 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w" podStartSLOduration=1.271294553 podStartE2EDuration="4.414108111s" podCreationTimestamp="2026-04-20 15:12:55 +0000 UTC" firstStartedPulling="2026-04-20 15:12:55.99970867 +0000 UTC m=+636.130583951" lastFinishedPulling="2026-04-20 15:12:59.142522224 +0000 UTC m=+639.273397509" observedRunningTime="2026-04-20 15:12:59.412050683 +0000 UTC m=+639.542925987" watchObservedRunningTime="2026-04-20 15:12:59.414108111 +0000 UTC m=+639.544983465"
Apr 20 15:12:59.426583 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:12:59.426538 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp" podStartSLOduration=1.58259756 podStartE2EDuration="4.426526844s" podCreationTimestamp="2026-04-20 15:12:55 +0000 UTC" firstStartedPulling="2026-04-20 15:12:56.288705549 +0000 UTC m=+636.419580830" lastFinishedPulling="2026-04-20 15:12:59.132634833 +0000 UTC m=+639.263510114" observedRunningTime="2026-04-20 15:12:59.425718183 +0000 UTC m=+639.556593490" watchObservedRunningTime="2026-04-20 15:12:59.426526844 +0000 UTC m=+639.557402146"
Apr 20 15:13:10.400693 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:10.400664 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-n6kbp"
Apr 20 15:13:10.401037 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:10.400710 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:13:10.455711 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:10.455673 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-94j7w"]
Apr 20 15:13:10.455933 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:10.455892 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w" podUID="c1a85951-ba0b-48b9-9a1b-2a9a8848ad13" containerName="limitador" containerID="cri-o://c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41" gracePeriod=30
Apr 20 15:13:11.003786 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.003765 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:13:11.142927 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.142853 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5qvl\" (UniqueName: \"kubernetes.io/projected/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-kube-api-access-d5qvl\") pod \"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13\" (UID: \"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13\") "
Apr 20 15:13:11.142927 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.142894 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-config-file\") pod \"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13\" (UID: \"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13\") "
Apr 20 15:13:11.143279 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.143255 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-config-file" (OuterVolumeSpecName: "config-file") pod "c1a85951-ba0b-48b9-9a1b-2a9a8848ad13" (UID: "c1a85951-ba0b-48b9-9a1b-2a9a8848ad13"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 15:13:11.144884 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.144859 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-kube-api-access-d5qvl" (OuterVolumeSpecName: "kube-api-access-d5qvl") pod "c1a85951-ba0b-48b9-9a1b-2a9a8848ad13" (UID: "c1a85951-ba0b-48b9-9a1b-2a9a8848ad13"). InnerVolumeSpecName "kube-api-access-d5qvl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:13:11.243875 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.243846 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d5qvl\" (UniqueName: \"kubernetes.io/projected/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-kube-api-access-d5qvl\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\""
Apr 20 15:13:11.243875 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.243872 2572 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13-config-file\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\""
Apr 20 15:13:11.437189 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.437104 2572 generic.go:358] "Generic (PLEG): container finished" podID="c1a85951-ba0b-48b9-9a1b-2a9a8848ad13" containerID="c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41" exitCode=0
Apr 20 15:13:11.437189 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.437173 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w"
Apr 20 15:13:11.437640 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.437187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w" event={"ID":"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13","Type":"ContainerDied","Data":"c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41"}
Apr 20 15:13:11.437640 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.437225 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-94j7w" event={"ID":"c1a85951-ba0b-48b9-9a1b-2a9a8848ad13","Type":"ContainerDied","Data":"fb8803589a09bd566fba3700a8a4d1a7e462f3d42e91c36c62cde1f32ee2b74c"}
Apr 20 15:13:11.437640 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.437240 2572 scope.go:117] "RemoveContainer" containerID="c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41"
Apr 20 15:13:11.445432 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.445415 2572 scope.go:117] "RemoveContainer" containerID="c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41"
Apr 20 15:13:11.445702 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:13:11.445684 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41\": container with ID starting with c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41 not found: ID does not exist" containerID="c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41"
Apr 20 15:13:11.445756 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.445710 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41"} err="failed to get container status \"c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41\": rpc error: code = NotFound desc = could not find container \"c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41\": container with ID starting with c87a6e214868f5104cf079211d06aad0402056dbb82d06083e67be613adc7d41 not found: ID does not exist"
Apr 20 15:13:11.460475 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.460452 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-94j7w"]
Apr 20 15:13:11.464318 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.464296 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-94j7w"]
Apr 20 15:13:11.590640 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.590608 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-gqcfc"]
Apr 20 15:13:11.590901 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.590890 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1a85951-ba0b-48b9-9a1b-2a9a8848ad13" containerName="limitador"
Apr 20 15:13:11.590945 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.590903 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a85951-ba0b-48b9-9a1b-2a9a8848ad13" containerName="limitador"
Apr 20 15:13:11.590979 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.590962 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1a85951-ba0b-48b9-9a1b-2a9a8848ad13" containerName="limitador"
Apr 20 15:13:11.594873 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.594854 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-gqcfc"
Apr 20 15:13:11.597300 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.597278 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 20 15:13:11.597409 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.597329 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-gspq6\""
Apr 20 15:13:11.602305 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.602283 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-gqcfc"]
Apr 20 15:13:11.748175 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.748150 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86k6z\" (UniqueName: \"kubernetes.io/projected/fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc-kube-api-access-86k6z\") pod \"postgres-868db5846d-gqcfc\" (UID: \"fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc\") " pod="opendatahub/postgres-868db5846d-gqcfc"
Apr 20 15:13:11.748335 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.748191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc-data\") pod \"postgres-868db5846d-gqcfc\" (UID: \"fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc\") " pod="opendatahub/postgres-868db5846d-gqcfc"
Apr 20 15:13:11.849464 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.849435 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86k6z\" (UniqueName: \"kubernetes.io/projected/fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc-kube-api-access-86k6z\") pod \"postgres-868db5846d-gqcfc\" (UID: \"fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc\") " pod="opendatahub/postgres-868db5846d-gqcfc"
Apr 20 15:13:11.849636 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.849506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc-data\") pod \"postgres-868db5846d-gqcfc\" (UID: \"fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc\") " pod="opendatahub/postgres-868db5846d-gqcfc"
Apr 20 15:13:11.849859 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.849841 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc-data\") pod \"postgres-868db5846d-gqcfc\" (UID: \"fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc\") " pod="opendatahub/postgres-868db5846d-gqcfc"
Apr 20 15:13:11.857834 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.857802 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86k6z\" (UniqueName: \"kubernetes.io/projected/fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc-kube-api-access-86k6z\") pod \"postgres-868db5846d-gqcfc\" (UID: \"fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc\") " pod="opendatahub/postgres-868db5846d-gqcfc"
Apr 20 15:13:11.906764 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:11.906744 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-gqcfc"
Apr 20 15:13:12.021893 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:12.021870 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-gqcfc"]
Apr 20 15:13:12.023954 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:13:12.023927 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbb2fbe2_9ebf_4e0b_ac7e_57db6538eafc.slice/crio-2a513fffeb3cef52c7233eea40220cf488865d161abc97a5e8dac1dfa3426417 WatchSource:0}: Error finding container 2a513fffeb3cef52c7233eea40220cf488865d161abc97a5e8dac1dfa3426417: Status 404 returned error can't find the container with id 2a513fffeb3cef52c7233eea40220cf488865d161abc97a5e8dac1dfa3426417
Apr 20 15:13:12.410158 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:12.410072 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a85951-ba0b-48b9-9a1b-2a9a8848ad13" path="/var/lib/kubelet/pods/c1a85951-ba0b-48b9-9a1b-2a9a8848ad13/volumes"
Apr 20 15:13:12.441074 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:12.441044 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-gqcfc" event={"ID":"fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc","Type":"ContainerStarted","Data":"2a513fffeb3cef52c7233eea40220cf488865d161abc97a5e8dac1dfa3426417"}
Apr 20 15:13:17.464556 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:17.464459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-gqcfc" event={"ID":"fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc","Type":"ContainerStarted","Data":"2941503c2facf098f5f5766d9afe8f54b05b50716d74ec3b14b8b225c52a8210"}
Apr 20 15:13:17.464893 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:17.464635 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-gqcfc"
Apr 20 15:13:17.481043 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:17.480996 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-gqcfc" podStartSLOduration=1.305226604 podStartE2EDuration="6.480983089s" podCreationTimestamp="2026-04-20 15:13:11 +0000 UTC" firstStartedPulling="2026-04-20 15:13:12.025322248 +0000 UTC m=+652.156197532" lastFinishedPulling="2026-04-20 15:13:17.201078719 +0000 UTC m=+657.331954017" observedRunningTime="2026-04-20 15:13:17.479389382 +0000 UTC m=+657.610264687" watchObservedRunningTime="2026-04-20 15:13:17.480983089 +0000 UTC m=+657.611858392"
Apr 20 15:13:23.496302 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:23.496273 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-gqcfc"
Apr 20 15:13:32.920571 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:32.920542 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7d56498df5-2wdmm"]
Apr 20 15:13:32.952723 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:32.952691 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7d56498df5-2wdmm"]
Apr 20 15:13:32.952723 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:32.952717 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7d56498df5-2wdmm"
Apr 20 15:13:32.955275 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:32.955256 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 20 15:13:32.955540 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:32.955518 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-5m5vb\""
Apr 20 15:13:32.955642 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:32.955545 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 20 15:13:33.018834 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:33.018804 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqxm4\" (UniqueName: \"kubernetes.io/projected/2f618e72-f5db-4a9a-bbca-bde37b0e2017-kube-api-access-zqxm4\") pod \"maas-api-7d56498df5-2wdmm\" (UID: \"2f618e72-f5db-4a9a-bbca-bde37b0e2017\") " pod="opendatahub/maas-api-7d56498df5-2wdmm"
Apr 20 15:13:33.018969 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:33.018917 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2f618e72-f5db-4a9a-bbca-bde37b0e2017-maas-api-tls\") pod \"maas-api-7d56498df5-2wdmm\" (UID: \"2f618e72-f5db-4a9a-bbca-bde37b0e2017\") " pod="opendatahub/maas-api-7d56498df5-2wdmm"
Apr 20 15:13:33.119772 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:33.119744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqxm4\" (UniqueName: \"kubernetes.io/projected/2f618e72-f5db-4a9a-bbca-bde37b0e2017-kube-api-access-zqxm4\") pod \"maas-api-7d56498df5-2wdmm\" (UID: \"2f618e72-f5db-4a9a-bbca-bde37b0e2017\") " pod="opendatahub/maas-api-7d56498df5-2wdmm"
Apr 20 15:13:33.119915 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:33.119801 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2f618e72-f5db-4a9a-bbca-bde37b0e2017-maas-api-tls\") pod \"maas-api-7d56498df5-2wdmm\" (UID: \"2f618e72-f5db-4a9a-bbca-bde37b0e2017\") " pod="opendatahub/maas-api-7d56498df5-2wdmm"
Apr 20 15:13:33.122214 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:33.122194 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2f618e72-f5db-4a9a-bbca-bde37b0e2017-maas-api-tls\") pod \"maas-api-7d56498df5-2wdmm\" (UID: \"2f618e72-f5db-4a9a-bbca-bde37b0e2017\") " pod="opendatahub/maas-api-7d56498df5-2wdmm"
Apr 20 15:13:33.130520 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:33.130476 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqxm4\" (UniqueName: \"kubernetes.io/projected/2f618e72-f5db-4a9a-bbca-bde37b0e2017-kube-api-access-zqxm4\") pod \"maas-api-7d56498df5-2wdmm\" (UID: \"2f618e72-f5db-4a9a-bbca-bde37b0e2017\") " pod="opendatahub/maas-api-7d56498df5-2wdmm"
Apr 20 15:13:33.262752 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:33.262723 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7d56498df5-2wdmm"
Apr 20 15:13:33.377839 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:33.377810 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:13:33.381185 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:33.381120 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7d56498df5-2wdmm"]
Apr 20 15:13:33.383240 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:13:33.383216 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f618e72_f5db_4a9a_bbca_bde37b0e2017.slice/crio-455b69d0c61b53b8e1731a563219334a4b4121a3da3590f97e688b7c907ffaf8 WatchSource:0}: Error finding container 455b69d0c61b53b8e1731a563219334a4b4121a3da3590f97e688b7c907ffaf8: Status 404 returned error can't find the container with id 455b69d0c61b53b8e1731a563219334a4b4121a3da3590f97e688b7c907ffaf8
Apr 20 15:13:33.521560 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:33.521468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7d56498df5-2wdmm" event={"ID":"2f618e72-f5db-4a9a-bbca-bde37b0e2017","Type":"ContainerStarted","Data":"455b69d0c61b53b8e1731a563219334a4b4121a3da3590f97e688b7c907ffaf8"}
Apr 20 15:13:36.536834 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:36.536802 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7d56498df5-2wdmm" event={"ID":"2f618e72-f5db-4a9a-bbca-bde37b0e2017","Type":"ContainerStarted","Data":"eaf05daf1b9c581a459d7fd4694e21532b34f7f0d4764934497ca274c06be871"}
Apr 20 15:13:36.537215 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:36.536937 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7d56498df5-2wdmm"
Apr 20 15:13:36.554158 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:36.554111 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7d56498df5-2wdmm" podStartSLOduration=1.898559554 podStartE2EDuration="4.554099082s" podCreationTimestamp="2026-04-20 15:13:32 +0000 UTC" firstStartedPulling="2026-04-20 15:13:33.384585515 +0000 UTC m=+673.515460799" lastFinishedPulling="2026-04-20 15:13:36.040125046 +0000 UTC m=+676.171000327" observedRunningTime="2026-04-20 15:13:36.55264523 +0000 UTC m=+676.683520532" watchObservedRunningTime="2026-04-20 15:13:36.554099082 +0000 UTC m=+676.684974384"
Apr 20 15:13:42.546550 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:42.546517 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7d56498df5-2wdmm"
Apr 20 15:13:42.695501 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:42.695452 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7458b9fb6c-r2kct"]
Apr 20 15:13:42.698429 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:42.698407 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7458b9fb6c-r2kct"
Apr 20 15:13:42.700893 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:42.700873 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-hs4dp\""
Apr 20 15:13:42.706507 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:42.706466 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7458b9fb6c-r2kct"]
Apr 20 15:13:42.813808 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:42.813728 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd88g\" (UniqueName: \"kubernetes.io/projected/7d0c6552-5aa6-4354-86fc-1c77fcda2bd5-kube-api-access-gd88g\") pod \"maas-controller-7458b9fb6c-r2kct\" (UID: \"7d0c6552-5aa6-4354-86fc-1c77fcda2bd5\") " pod="opendatahub/maas-controller-7458b9fb6c-r2kct"
Apr 20 15:13:42.915036 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:42.915002 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd88g\" (UniqueName: \"kubernetes.io/projected/7d0c6552-5aa6-4354-86fc-1c77fcda2bd5-kube-api-access-gd88g\") pod \"maas-controller-7458b9fb6c-r2kct\" (UID: \"7d0c6552-5aa6-4354-86fc-1c77fcda2bd5\") " pod="opendatahub/maas-controller-7458b9fb6c-r2kct"
Apr 20 15:13:42.922414 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:42.922392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd88g\" (UniqueName: \"kubernetes.io/projected/7d0c6552-5aa6-4354-86fc-1c77fcda2bd5-kube-api-access-gd88g\") pod \"maas-controller-7458b9fb6c-r2kct\" (UID: \"7d0c6552-5aa6-4354-86fc-1c77fcda2bd5\") " pod="opendatahub/maas-controller-7458b9fb6c-r2kct"
Apr 20 15:13:43.008891 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:43.008865 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7458b9fb6c-r2kct"
Apr 20 15:13:43.124997 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:43.124972 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7458b9fb6c-r2kct"]
Apr 20 15:13:43.126997 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:13:43.126970 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d0c6552_5aa6_4354_86fc_1c77fcda2bd5.slice/crio-c84c38c1da76026fb95b24f9ca4bde079871ac3d572f41e14595dad95a92353b WatchSource:0}: Error finding container c84c38c1da76026fb95b24f9ca4bde079871ac3d572f41e14595dad95a92353b: Status 404 returned error can't find the container with id c84c38c1da76026fb95b24f9ca4bde079871ac3d572f41e14595dad95a92353b
Apr 20 15:13:43.561776 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:43.561738 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7458b9fb6c-r2kct" event={"ID":"7d0c6552-5aa6-4354-86fc-1c77fcda2bd5","Type":"ContainerStarted","Data":"c84c38c1da76026fb95b24f9ca4bde079871ac3d572f41e14595dad95a92353b"}
Apr 20 15:13:45.570769 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:45.570737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7458b9fb6c-r2kct" event={"ID":"7d0c6552-5aa6-4354-86fc-1c77fcda2bd5","Type":"ContainerStarted","Data":"0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0"}
Apr 20 15:13:45.571146 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:45.570784 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7458b9fb6c-r2kct"
Apr 20 15:13:45.587455 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:45.587405 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7458b9fb6c-r2kct" podStartSLOduration=2.022651114 podStartE2EDuration="3.587391757s" podCreationTimestamp="2026-04-20 15:13:42 +0000 UTC" firstStartedPulling="2026-04-20 15:13:43.128191205 +0000 UTC m=+683.259066486" lastFinishedPulling="2026-04-20 15:13:44.692931833 +0000 UTC m=+684.823807129" observedRunningTime="2026-04-20 15:13:45.585815842 +0000 UTC m=+685.716691144" watchObservedRunningTime="2026-04-20 15:13:45.587391757 +0000 UTC m=+685.718267059"
Apr 20 15:13:56.579361 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:13:56.579332 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7458b9fb6c-r2kct"
Apr 20 15:14:08.494503 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.494456 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr"]
Apr 20 15:14:08.499168 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.499151 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr"
Apr 20 15:14:08.501668 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.501635 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 20 15:14:08.502607 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.502589 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-68bzf\""
Apr 20 15:14:08.502607 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.502599 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 20 15:14:08.502752 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.502589 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 20 15:14:08.514789 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.514764 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr"] Apr 20 15:14:08.622818 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.622783 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.622818 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.622830 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2fad171-af85-4000-82c1-7548c2b85d2e-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.623020 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.622853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.623020 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.622928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rczpx\" (UniqueName: \"kubernetes.io/projected/a2fad171-af85-4000-82c1-7548c2b85d2e-kube-api-access-rczpx\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.623020 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.622997 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.623120 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.623025 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.724429 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.724388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.724639 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.724440 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2fad171-af85-4000-82c1-7548c2b85d2e-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.724639 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.724467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: 
\"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.724639 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.724517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rczpx\" (UniqueName: \"kubernetes.io/projected/a2fad171-af85-4000-82c1-7548c2b85d2e-kube-api-access-rczpx\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.724639 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.724572 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.724639 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.724605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.724913 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.724885 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.724976 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.724940 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.725021 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.724993 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.726741 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.726700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2fad171-af85-4000-82c1-7548c2b85d2e-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.726858 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.726828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2fad171-af85-4000-82c1-7548c2b85d2e-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.732319 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.732294 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczpx\" (UniqueName: \"kubernetes.io/projected/a2fad171-af85-4000-82c1-7548c2b85d2e-kube-api-access-rczpx\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr\" (UID: \"a2fad171-af85-4000-82c1-7548c2b85d2e\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.813527 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.813435 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:08.939293 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:08.939266 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr"] Apr 20 15:14:08.940722 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:14:08.940699 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2fad171_af85_4000_82c1_7548c2b85d2e.slice/crio-7701b3963e9bffbce2d1e6d8a25ab10716a40f698ed8f69f68ff6023c571c94d WatchSource:0}: Error finding container 7701b3963e9bffbce2d1e6d8a25ab10716a40f698ed8f69f68ff6023c571c94d: Status 404 returned error can't find the container with id 7701b3963e9bffbce2d1e6d8a25ab10716a40f698ed8f69f68ff6023c571c94d Apr 20 15:14:09.074960 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:09.074869 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:14:09.662650 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:09.662611 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" event={"ID":"a2fad171-af85-4000-82c1-7548c2b85d2e","Type":"ContainerStarted","Data":"7701b3963e9bffbce2d1e6d8a25ab10716a40f698ed8f69f68ff6023c571c94d"} Apr 20 15:14:12.380361 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.380323 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:14:12.505233 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.505199 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7d56498df5-2wdmm"] Apr 20 15:14:12.505536 ip-10-0-134-230 
kubenswrapper[2572]: I0420 15:14:12.505506 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-7d56498df5-2wdmm" podUID="2f618e72-f5db-4a9a-bbca-bde37b0e2017" containerName="maas-api" containerID="cri-o://eaf05daf1b9c581a459d7fd4694e21532b34f7f0d4764934497ca274c06be871" gracePeriod=30 Apr 20 15:14:12.541312 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.541279 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="opendatahub/maas-api-7d56498df5-2wdmm" podUID="2f618e72-f5db-4a9a-bbca-bde37b0e2017" containerName="maas-api" probeResult="failure" output="Get \"https://10.132.0.39:8443/health\": dial tcp 10.132.0.39:8443: connect: connection refused" Apr 20 15:14:12.677203 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.677072 2572 generic.go:358] "Generic (PLEG): container finished" podID="2f618e72-f5db-4a9a-bbca-bde37b0e2017" containerID="eaf05daf1b9c581a459d7fd4694e21532b34f7f0d4764934497ca274c06be871" exitCode=0 Apr 20 15:14:12.677203 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.677133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7d56498df5-2wdmm" event={"ID":"2f618e72-f5db-4a9a-bbca-bde37b0e2017","Type":"ContainerDied","Data":"eaf05daf1b9c581a459d7fd4694e21532b34f7f0d4764934497ca274c06be871"} Apr 20 15:14:12.766803 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.766604 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7d56498df5-2wdmm" Apr 20 15:14:12.867741 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.867708 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqxm4\" (UniqueName: \"kubernetes.io/projected/2f618e72-f5db-4a9a-bbca-bde37b0e2017-kube-api-access-zqxm4\") pod \"2f618e72-f5db-4a9a-bbca-bde37b0e2017\" (UID: \"2f618e72-f5db-4a9a-bbca-bde37b0e2017\") " Apr 20 15:14:12.867902 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.867802 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2f618e72-f5db-4a9a-bbca-bde37b0e2017-maas-api-tls\") pod \"2f618e72-f5db-4a9a-bbca-bde37b0e2017\" (UID: \"2f618e72-f5db-4a9a-bbca-bde37b0e2017\") " Apr 20 15:14:12.870059 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.870033 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f618e72-f5db-4a9a-bbca-bde37b0e2017-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "2f618e72-f5db-4a9a-bbca-bde37b0e2017" (UID: "2f618e72-f5db-4a9a-bbca-bde37b0e2017"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:14:12.870176 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.870161 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f618e72-f5db-4a9a-bbca-bde37b0e2017-kube-api-access-zqxm4" (OuterVolumeSpecName: "kube-api-access-zqxm4") pod "2f618e72-f5db-4a9a-bbca-bde37b0e2017" (UID: "2f618e72-f5db-4a9a-bbca-bde37b0e2017"). InnerVolumeSpecName "kube-api-access-zqxm4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:14:12.968814 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.968739 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zqxm4\" (UniqueName: \"kubernetes.io/projected/2f618e72-f5db-4a9a-bbca-bde37b0e2017-kube-api-access-zqxm4\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:14:12.968814 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:12.968771 2572 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2f618e72-f5db-4a9a-bbca-bde37b0e2017-maas-api-tls\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\"" Apr 20 15:14:13.687953 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:13.687912 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7d56498df5-2wdmm" event={"ID":"2f618e72-f5db-4a9a-bbca-bde37b0e2017","Type":"ContainerDied","Data":"455b69d0c61b53b8e1731a563219334a4b4121a3da3590f97e688b7c907ffaf8"} Apr 20 15:14:13.688408 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:13.687970 2572 scope.go:117] "RemoveContainer" containerID="eaf05daf1b9c581a459d7fd4694e21532b34f7f0d4764934497ca274c06be871" Apr 20 15:14:13.688408 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:13.687969 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7d56498df5-2wdmm" Apr 20 15:14:13.713110 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:13.713086 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7d56498df5-2wdmm"] Apr 20 15:14:13.716690 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:13.716670 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-7d56498df5-2wdmm"] Apr 20 15:14:14.411820 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:14.411785 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f618e72-f5db-4a9a-bbca-bde37b0e2017" path="/var/lib/kubelet/pods/2f618e72-f5db-4a9a-bbca-bde37b0e2017/volumes" Apr 20 15:14:16.706547 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:16.706508 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" event={"ID":"a2fad171-af85-4000-82c1-7548c2b85d2e","Type":"ContainerStarted","Data":"a15e5ea51dcb6d4ed64a3d10fc1316332aa9e3ed99410654a1431939067f11bb"} Apr 20 15:14:21.724592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:21.724505 2572 generic.go:358] "Generic (PLEG): container finished" podID="a2fad171-af85-4000-82c1-7548c2b85d2e" containerID="a15e5ea51dcb6d4ed64a3d10fc1316332aa9e3ed99410654a1431939067f11bb" exitCode=0 Apr 20 15:14:21.724592 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:21.724513 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" event={"ID":"a2fad171-af85-4000-82c1-7548c2b85d2e","Type":"ContainerDied","Data":"a15e5ea51dcb6d4ed64a3d10fc1316332aa9e3ed99410654a1431939067f11bb"} Apr 20 15:14:23.735151 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:23.735118 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" 
event={"ID":"a2fad171-af85-4000-82c1-7548c2b85d2e","Type":"ContainerStarted","Data":"86c0e5fe55c971f96e9512563b7f4bb4ca2f296bbc8626e754a5f0679b775aa4"} Apr 20 15:14:23.735557 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:23.735328 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" Apr 20 15:14:23.755119 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:23.755069 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr" podStartSLOduration=1.797301915 podStartE2EDuration="15.755056633s" podCreationTimestamp="2026-04-20 15:14:08 +0000 UTC" firstStartedPulling="2026-04-20 15:14:08.942564828 +0000 UTC m=+709.073440109" lastFinishedPulling="2026-04-20 15:14:22.900319528 +0000 UTC m=+723.031194827" observedRunningTime="2026-04-20 15:14:23.75152667 +0000 UTC m=+723.882401978" watchObservedRunningTime="2026-04-20 15:14:23.755056633 +0000 UTC m=+723.885931993" Apr 20 15:14:28.902155 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:28.902014 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh"] Apr 20 15:14:28.902707 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:28.902530 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f618e72-f5db-4a9a-bbca-bde37b0e2017" containerName="maas-api" Apr 20 15:14:28.902707 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:28.902551 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f618e72-f5db-4a9a-bbca-bde37b0e2017" containerName="maas-api" Apr 20 15:14:28.902707 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:28.902674 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f618e72-f5db-4a9a-bbca-bde37b0e2017" containerName="maas-api" Apr 20 15:14:28.905968 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:28.905950 2572 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:28.908397 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:28.908377 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 15:14:28.916232 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:28.916209 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh"] Apr 20 15:14:29.002651 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.002624 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7151849-59c5-497d-98c0-b99e422e5f17-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.002809 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.002667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.002809 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.002729 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.002809 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.002787 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q48vw\" (UniqueName: \"kubernetes.io/projected/b7151849-59c5-497d-98c0-b99e422e5f17-kube-api-access-q48vw\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.002809 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.002805 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.002987 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.002827 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.103700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.103664 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.103872 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.103707 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.103872 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.103735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q48vw\" (UniqueName: \"kubernetes.io/projected/b7151849-59c5-497d-98c0-b99e422e5f17-kube-api-access-q48vw\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.103872 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.103752 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.103872 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.103774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.103872 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.103813 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7151849-59c5-497d-98c0-b99e422e5f17-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.104133 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.104091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.104189 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.104145 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.104226 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.104183 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.105947 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.105919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b7151849-59c5-497d-98c0-b99e422e5f17-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" Apr 20 15:14:29.106235 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.106219 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b7151849-59c5-497d-98c0-b99e422e5f17-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh"
Apr 20 15:14:29.111194 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.111169 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q48vw\" (UniqueName: \"kubernetes.io/projected/b7151849-59c5-497d-98c0-b99e422e5f17-kube-api-access-q48vw\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh\" (UID: \"b7151849-59c5-497d-98c0-b99e422e5f17\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh"
Apr 20 15:14:29.216557 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.216472 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh"
Apr 20 15:14:29.341898 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.341872 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh"]
Apr 20 15:14:29.343532 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:14:29.343505 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7151849_59c5_497d_98c0_b99e422e5f17.slice/crio-801c08fff85b9268d5552052239d4709cd9dd8261aa14a9c658f9012d3a9b811 WatchSource:0}: Error finding container 801c08fff85b9268d5552052239d4709cd9dd8261aa14a9c658f9012d3a9b811: Status 404 returned error can't find the container with id 801c08fff85b9268d5552052239d4709cd9dd8261aa14a9c658f9012d3a9b811
Apr 20 15:14:29.680299 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.680263 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:14:29.758970 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.758933 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" event={"ID":"b7151849-59c5-497d-98c0-b99e422e5f17","Type":"ContainerStarted","Data":"c1033b4232e91f0944b3553f2a9caf8af0d8397d727cb4145a8fdbea20c584d9"}
Apr 20 15:14:29.759131 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:29.758978 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" event={"ID":"b7151849-59c5-497d-98c0-b99e422e5f17","Type":"ContainerStarted","Data":"801c08fff85b9268d5552052239d4709cd9dd8261aa14a9c658f9012d3a9b811"}
Apr 20 15:14:34.756848 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:34.756821 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr"
Apr 20 15:14:34.780902 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:34.780867 2572 generic.go:358] "Generic (PLEG): container finished" podID="b7151849-59c5-497d-98c0-b99e422e5f17" containerID="c1033b4232e91f0944b3553f2a9caf8af0d8397d727cb4145a8fdbea20c584d9" exitCode=0
Apr 20 15:14:34.781065 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:34.780904 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" event={"ID":"b7151849-59c5-497d-98c0-b99e422e5f17","Type":"ContainerDied","Data":"c1033b4232e91f0944b3553f2a9caf8af0d8397d727cb4145a8fdbea20c584d9"}
Apr 20 15:14:35.785950 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:35.785916 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" event={"ID":"b7151849-59c5-497d-98c0-b99e422e5f17","Type":"ContainerStarted","Data":"12deadb60da92d2c1390e6aed18549bcf38341061b4e94d8e20e5abe454c328f"}
Apr 20 15:14:35.786341 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:35.786135 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh"
Apr 20 15:14:35.807527 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:35.807461 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh" podStartSLOduration=7.585333173 podStartE2EDuration="7.807448467s" podCreationTimestamp="2026-04-20 15:14:28 +0000 UTC" firstStartedPulling="2026-04-20 15:14:34.781721353 +0000 UTC m=+734.912596642" lastFinishedPulling="2026-04-20 15:14:35.003836637 +0000 UTC m=+735.134711936" observedRunningTime="2026-04-20 15:14:35.803952007 +0000 UTC m=+735.934827345" watchObservedRunningTime="2026-04-20 15:14:35.807448467 +0000 UTC m=+735.938323769"
Apr 20 15:14:38.094659 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.094620 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"]
Apr 20 15:14:38.098203 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.098182 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.100606 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.100579 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 20 15:14:38.108900 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.108873 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"]
Apr 20 15:14:38.179120 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.179090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.179277 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.179124 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.179277 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.179197 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.179277 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.179250 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx6xv\" (UniqueName: \"kubernetes.io/projected/48bae398-8ec9-4e2d-bb73-1c569af4f980-kube-api-access-qx6xv\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.179436 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.179282 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.179436 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.179344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48bae398-8ec9-4e2d-bb73-1c569af4f980-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.280108 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.280069 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.280256 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.280141 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48bae398-8ec9-4e2d-bb73-1c569af4f980-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.280256 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.280215 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.280256 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.280251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.280432 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.280317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.280432 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.280347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx6xv\" (UniqueName: \"kubernetes.io/projected/48bae398-8ec9-4e2d-bb73-1c569af4f980-kube-api-access-qx6xv\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.280601 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.280577 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.280760 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.280742 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.280821 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.280734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.282444 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.282420 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48bae398-8ec9-4e2d-bb73-1c569af4f980-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.282607 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.282591 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48bae398-8ec9-4e2d-bb73-1c569af4f980-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.288597 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.288573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx6xv\" (UniqueName: \"kubernetes.io/projected/48bae398-8ec9-4e2d-bb73-1c569af4f980-kube-api-access-qx6xv\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p\" (UID: \"48bae398-8ec9-4e2d-bb73-1c569af4f980\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.410070 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.409994 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:38.538289 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.538252 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"]
Apr 20 15:14:38.542177 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:14:38.542148 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48bae398_8ec9_4e2d_bb73_1c569af4f980.slice/crio-6594a8f35d2c61b624d46eb12582475e3c853c49bf1033ca6034d46909555847 WatchSource:0}: Error finding container 6594a8f35d2c61b624d46eb12582475e3c853c49bf1033ca6034d46909555847: Status 404 returned error can't find the container with id 6594a8f35d2c61b624d46eb12582475e3c853c49bf1033ca6034d46909555847
Apr 20 15:14:38.798406 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.798376 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p" event={"ID":"48bae398-8ec9-4e2d-bb73-1c569af4f980","Type":"ContainerStarted","Data":"2898f2bf2ea05fc36e46e4967ba0f34e1814488bd4e10c4ec84daf1432ed5f2e"}
Apr 20 15:14:38.798406 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:38.798413 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p" event={"ID":"48bae398-8ec9-4e2d-bb73-1c569af4f980","Type":"ContainerStarted","Data":"6594a8f35d2c61b624d46eb12582475e3c853c49bf1033ca6034d46909555847"}
Apr 20 15:14:39.778548 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:39.778514 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:14:44.820122 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:44.820086 2572 generic.go:358] "Generic (PLEG): container finished" podID="48bae398-8ec9-4e2d-bb73-1c569af4f980" containerID="2898f2bf2ea05fc36e46e4967ba0f34e1814488bd4e10c4ec84daf1432ed5f2e" exitCode=0
Apr 20 15:14:44.820548 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:44.820161 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p" event={"ID":"48bae398-8ec9-4e2d-bb73-1c569af4f980","Type":"ContainerDied","Data":"2898f2bf2ea05fc36e46e4967ba0f34e1814488bd4e10c4ec84daf1432ed5f2e"}
Apr 20 15:14:45.825263 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:45.825227 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p" event={"ID":"48bae398-8ec9-4e2d-bb73-1c569af4f980","Type":"ContainerStarted","Data":"28a457faba0de8d5b3f123a3750253d0aadc1e603f4dbfdb9ee74f7325b3ff47"}
Apr 20 15:14:45.825635 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:45.825430 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:14:46.802820 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:46.802789 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh"
Apr 20 15:14:46.825247 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:46.825195 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p" podStartSLOduration=8.571238536 podStartE2EDuration="8.825181994s" podCreationTimestamp="2026-04-20 15:14:38 +0000 UTC" firstStartedPulling="2026-04-20 15:14:44.820790285 +0000 UTC m=+744.951665566" lastFinishedPulling="2026-04-20 15:14:45.074733744 +0000 UTC m=+745.205609024" observedRunningTime="2026-04-20 15:14:45.85600616 +0000 UTC m=+745.986881462" watchObservedRunningTime="2026-04-20 15:14:46.825181994 +0000 UTC m=+746.956057297"
Apr 20 15:14:56.841703 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:14:56.841668 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p"
Apr 20 15:15:01.379472 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:15:01.379435 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:15:39.074541 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:15:39.074511 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:16:31.781943 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:16:31.781909 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:16:42.589344 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:16:42.589306 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:16:51.393823 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:16:51.393788 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:17:01.293496 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:01.293463 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:17:10.590794 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:10.590715 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:17:21.285586 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:21.285549 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:17:27.887547 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:27.887503 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7458b9fb6c-r2kct"]
Apr 20 15:17:27.888138 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:27.887805 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7458b9fb6c-r2kct" podUID="7d0c6552-5aa6-4354-86fc-1c77fcda2bd5" containerName="manager" containerID="cri-o://0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0" gracePeriod=10
Apr 20 15:17:28.128290 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.128267 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7458b9fb6c-r2kct"
Apr 20 15:17:28.209067 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.208985 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd88g\" (UniqueName: \"kubernetes.io/projected/7d0c6552-5aa6-4354-86fc-1c77fcda2bd5-kube-api-access-gd88g\") pod \"7d0c6552-5aa6-4354-86fc-1c77fcda2bd5\" (UID: \"7d0c6552-5aa6-4354-86fc-1c77fcda2bd5\") "
Apr 20 15:17:28.210979 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.210946 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d0c6552-5aa6-4354-86fc-1c77fcda2bd5-kube-api-access-gd88g" (OuterVolumeSpecName: "kube-api-access-gd88g") pod "7d0c6552-5aa6-4354-86fc-1c77fcda2bd5" (UID: "7d0c6552-5aa6-4354-86fc-1c77fcda2bd5"). InnerVolumeSpecName "kube-api-access-gd88g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:17:28.310091 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.310062 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gd88g\" (UniqueName: \"kubernetes.io/projected/7d0c6552-5aa6-4354-86fc-1c77fcda2bd5-kube-api-access-gd88g\") on node \"ip-10-0-134-230.ec2.internal\" DevicePath \"\""
Apr 20 15:17:28.425045 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.425017 2572 generic.go:358] "Generic (PLEG): container finished" podID="7d0c6552-5aa6-4354-86fc-1c77fcda2bd5" containerID="0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0" exitCode=0
Apr 20 15:17:28.425152 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.425075 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7458b9fb6c-r2kct"
Apr 20 15:17:28.425152 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.425092 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7458b9fb6c-r2kct" event={"ID":"7d0c6552-5aa6-4354-86fc-1c77fcda2bd5","Type":"ContainerDied","Data":"0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0"}
Apr 20 15:17:28.425152 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.425126 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7458b9fb6c-r2kct" event={"ID":"7d0c6552-5aa6-4354-86fc-1c77fcda2bd5","Type":"ContainerDied","Data":"c84c38c1da76026fb95b24f9ca4bde079871ac3d572f41e14595dad95a92353b"}
Apr 20 15:17:28.425152 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.425147 2572 scope.go:117] "RemoveContainer" containerID="0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0"
Apr 20 15:17:28.433377 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.433362 2572 scope.go:117] "RemoveContainer" containerID="0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0"
Apr 20 15:17:28.433613 ip-10-0-134-230 kubenswrapper[2572]: E0420 15:17:28.433596 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0\": container with ID starting with 0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0 not found: ID does not exist" containerID="0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0"
Apr 20 15:17:28.433672 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.433621 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0"} err="failed to get container status \"0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0\": rpc error: code = NotFound desc = could not find container \"0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0\": container with ID starting with 0aa582e889c70d9a72842cd5375bd92a6c9e70dbeae0574f7982a7dbbbf475e0 not found: ID does not exist"
Apr 20 15:17:28.441517 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.441479 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7458b9fb6c-r2kct"]
Apr 20 15:17:28.445353 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:28.445330 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7458b9fb6c-r2kct"]
Apr 20 15:17:30.410460 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:17:30.410426 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d0c6552-5aa6-4354-86fc-1c77fcda2bd5" path="/var/lib/kubelet/pods/7d0c6552-5aa6-4354-86fc-1c77fcda2bd5/volumes"
Apr 20 15:18:21.986780 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:18:21.986744 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:18:37.884372 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:18:37.884332 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:19:16.887681 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:19:16.887645 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:19:33.283006 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:19:33.282975 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:19:48.079872 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:19:48.079836 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:20:05.170422 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:20:05.170381 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:20:57.284677 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:20:57.284642 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:21:05.473834 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:21:05.473795 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:21:21.578499 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:21:21.578459 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:21:31.079171 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:21:31.079136 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:21:47.877391 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:21:47.877313 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:21:56.177865 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:21:56.177826 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:22:28.778832 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:22:28.778800 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:22:37.480460 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:22:37.480426 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:22:46.175180 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:22:46.175142 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:22:53.476981 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:22:53.476949 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:23:02.074624 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:23:02.074586 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:23:19.182496 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:23:19.182426 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:23:30.774805 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:23:30.774767 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:24:17.079595 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:24:17.079560 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:24:24.974229 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:24:24.974190 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:24:34.181396 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:24:34.181355 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:24:43.173960 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:24:43.173882 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:24:52.277178 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:24:52.277144 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:25:00.504230 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:25:00.504197 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:25:09.984518 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:25:09.984472 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:25:17.975043 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:25:17.975009 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:25:26.679509 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:25:26.679463 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:25:35.474503 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:25:35.474450 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:25:44.482200 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:25:44.482164 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:25:53.081596 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:25:53.081562 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:26:01.878670 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:26:01.878633 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:26:10.082081 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:26:10.081999 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:26:20.169650 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:26:20.169616 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:26:28.079353 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:26:28.079316 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:26:37.180976 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:26:37.180938 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:26:45.174263 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:26:45.174225 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:29:05.083303 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:29:05.083264 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:29:09.980232 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:29:09.980151 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:29:34.781339 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:29:34.781303 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:29:42.786321 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:29:42.786282 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:29:51.778438 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:29:51.778401 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:30:02.668024 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:30:02.667988 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:30:10.181872 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:30:10.181832 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:30:21.871079 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:30:21.871042 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:30:30.874563 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:30:30.874526 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:30:40.884152 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:30:40.884073 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:30:49.879119 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:30:49.879079 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:30:59.380693 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:30:59.380653 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:31:09.277234 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:31:09.277200 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:31:42.470646 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:31:42.470615 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:32:25.081703 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:32:25.081662 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:32:32.873145 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:32:32.873109 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:32:42.878701 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:32:42.878663 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:32:51.373237 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:32:51.373195 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:32:59.278363 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:32:59.278328 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:33:12.281700 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:33:12.281664 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:33:21.377260 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:33:21.377226 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:33:27.374532 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:33:27.374478 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:33:37.979458 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:33:37.979420 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:33:46.379055 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:33:46.378968 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:33:53.876314 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:33:53.876277 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:34:04.382556 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:34:04.382513 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:34:22.679364 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:34:22.679331 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:34:30.983883 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:34:30.983848 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:34:39.175233 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:34:39.175197 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:34:48.067125 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:34:48.067090 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:35:05.177075 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:35:05.177039 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:35:13.274674 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:35:13.274594 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:35:22.588473 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:35:22.588438 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:35:30.978820 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:35:30.978783 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"]
Apr 20 15:35:39.270947 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:35:39.270909 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:35:47.673036 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:35:47.672996 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:35:56.725160 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:35:56.725121 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:36:07.985142 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:36:07.985106 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:36:16.997429 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:36:16.997394 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:36:28.478766 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:36:28.478736 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:36:37.380833 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:36:37.380801 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:36:46.309431 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:36:46.309338 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:36:53.978231 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:36:53.978193 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:37:02.978469 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:37:02.978433 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:37:19.834552 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:37:19.834516 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:37:28.479407 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:37:28.479377 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:37:36.578164 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:37:36.578127 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:37:45.674341 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:37:45.674307 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:38:08.874528 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:08.874481 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:38:21.776380 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:21.776344 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n6kbp"] Apr 20 15:38:27.773062 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:27.773029 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-2r77g_0653a16c-21ee-4062-b961-87c17974c316/manager/0.log" Apr 20 15:38:28.137223 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:28.137144 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-ws2wq_ba566142-9484-403a-b32d-f2a048b6021b/manager/2.log" Apr 20 15:38:28.489193 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:28.489118 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cdf6786d9-jz9pb_6be64f0b-1aba-438c-a1bd-ae1c81cd4a97/manager/0.log" Apr 20 15:38:28.607423 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:28.607392 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_postgres-868db5846d-gqcfc_fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc/postgres/0.log" Apr 20 15:38:29.349284 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.349255 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7_54d323d1-538a-4238-8566-911a30915416/pull/0.log" Apr 20 15:38:29.354852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.354831 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7_54d323d1-538a-4238-8566-911a30915416/extract/0.log" Apr 20 15:38:29.360542 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.360517 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7_54d323d1-538a-4238-8566-911a30915416/util/0.log" Apr 20 15:38:29.468628 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.468602 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6_399c50ec-720e-47b6-badd-f2188a2d0035/util/0.log" Apr 20 15:38:29.475121 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.475097 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6_399c50ec-720e-47b6-badd-f2188a2d0035/pull/0.log" Apr 20 15:38:29.481273 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.481254 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6_399c50ec-720e-47b6-badd-f2188a2d0035/extract/0.log" Apr 20 15:38:29.592663 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.592633 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg_b6c78ccb-cce3-4d64-95cc-f3454322aa30/pull/0.log" Apr 20 15:38:29.598791 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.598773 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg_b6c78ccb-cce3-4d64-95cc-f3454322aa30/extract/0.log" Apr 20 15:38:29.604730 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.604676 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg_b6c78ccb-cce3-4d64-95cc-f3454322aa30/util/0.log" Apr 20 15:38:29.711566 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.711542 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv_e9442516-f7c5-46be-bd33-94af1987a8f1/util/0.log" Apr 20 15:38:29.717467 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.717449 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv_e9442516-f7c5-46be-bd33-94af1987a8f1/pull/0.log" Apr 20 15:38:29.723019 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:29.723003 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv_e9442516-f7c5-46be-bd33-94af1987a8f1/extract/0.log" Apr 20 15:38:30.543324 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:30.543291 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-n6kbp_854cce36-de61-4a85-8526-67aa5a1fa03c/limitador/0.log" Apr 20 15:38:31.124826 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:31.124802 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-fgkrd_7159e679-12b5-4114-8f79-9e8ea4c77bdc/discovery/0.log" Apr 20 15:38:31.233383 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:31.233350 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7fcf5d587f-fnkd4_a507084f-a532-4ee0-b5ae-4bbde0f9b484/kube-auth-proxy/0.log" Apr 20 15:38:31.462394 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:31.462308 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-q76sp_6f8428f5-731d-47f5-a2a3-6e064ad22824/istio-proxy/0.log" Apr 20 15:38:31.914249 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:31.914221 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr_a2fad171-af85-4000-82c1-7548c2b85d2e/main/0.log" Apr 20 15:38:31.920663 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:31.920645 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-ggxzr_a2fad171-af85-4000-82c1-7548c2b85d2e/storage-initializer/0.log" Apr 20 15:38:32.261916 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:32.261886 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p_48bae398-8ec9-4e2d-bb73-1c569af4f980/storage-initializer/0.log" Apr 20 15:38:32.268522 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:32.268498 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcclv56p_48bae398-8ec9-4e2d-bb73-1c569af4f980/main/0.log" Apr 20 15:38:32.380383 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:32.380356 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh_b7151849-59c5-497d-98c0-b99e422e5f17/main/0.log" Apr 20 15:38:32.386261 
ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:32.386240 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-p2rrh_b7151849-59c5-497d-98c0-b99e422e5f17/storage-initializer/0.log" Apr 20 15:38:39.270886 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:39.270856 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wld75_1c29cec3-b554-4473-90f3-87629635db89/global-pull-secret-syncer/0.log" Apr 20 15:38:39.340101 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:39.340075 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-qh2tf_cb13e155-4ffa-49d2-8c49-c34c374a7d61/konnectivity-agent/0.log" Apr 20 15:38:39.441768 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:39.441738 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-230.ec2.internal_22ceeb306285ffc0c74b817e3784b6a3/haproxy/0.log" Apr 20 15:38:43.481691 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.481660 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7_54d323d1-538a-4238-8566-911a30915416/extract/0.log" Apr 20 15:38:43.504174 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.504152 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7_54d323d1-538a-4238-8566-911a30915416/util/0.log" Apr 20 15:38:43.526060 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.526037 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wwhp7_54d323d1-538a-4238-8566-911a30915416/pull/0.log" Apr 20 15:38:43.557130 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.557109 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6_399c50ec-720e-47b6-badd-f2188a2d0035/extract/0.log" Apr 20 15:38:43.581965 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.581944 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6_399c50ec-720e-47b6-badd-f2188a2d0035/util/0.log" Apr 20 15:38:43.606832 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.606807 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qvxc6_399c50ec-720e-47b6-badd-f2188a2d0035/pull/0.log" Apr 20 15:38:43.636550 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.636532 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg_b6c78ccb-cce3-4d64-95cc-f3454322aa30/extract/0.log" Apr 20 15:38:43.661267 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.661244 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg_b6c78ccb-cce3-4d64-95cc-f3454322aa30/util/0.log" Apr 20 15:38:43.682766 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.682743 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ldzvg_b6c78ccb-cce3-4d64-95cc-f3454322aa30/pull/0.log" Apr 20 15:38:43.709797 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.709776 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv_e9442516-f7c5-46be-bd33-94af1987a8f1/extract/0.log" Apr 20 15:38:43.731527 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.731509 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv_e9442516-f7c5-46be-bd33-94af1987a8f1/util/0.log" Apr 20 15:38:43.752212 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:43.752192 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d8ttv_e9442516-f7c5-46be-bd33-94af1987a8f1/pull/0.log" Apr 20 15:38:44.348799 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:44.348769 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-n6kbp_854cce36-de61-4a85-8526-67aa5a1fa03c/limitador/0.log" Apr 20 15:38:46.224119 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:46.224088 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4rv5w_a9732813-f0fb-4123-952e-fbb3c1c45a99/node-exporter/0.log" Apr 20 15:38:46.244504 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:46.244457 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4rv5w_a9732813-f0fb-4123-952e-fbb3c1c45a99/kube-rbac-proxy/0.log" Apr 20 15:38:46.263350 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:46.263333 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4rv5w_a9732813-f0fb-4123-952e-fbb3c1c45a99/init-textfile/0.log" Apr 20 15:38:47.877969 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.877940 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr"] Apr 20 15:38:47.878344 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.878288 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d0c6552-5aa6-4354-86fc-1c77fcda2bd5" containerName="manager" Apr 20 15:38:47.878344 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.878302 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7d0c6552-5aa6-4354-86fc-1c77fcda2bd5" containerName="manager" Apr 20 15:38:47.878547 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.878407 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d0c6552-5aa6-4354-86fc-1c77fcda2bd5" containerName="manager" Apr 20 15:38:47.881633 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.881615 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:47.883805 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.883784 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-58p5q\"/\"kube-root-ca.crt\"" Apr 20 15:38:47.884802 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.884762 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-58p5q\"/\"openshift-service-ca.crt\"" Apr 20 15:38:47.884802 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.884784 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-58p5q\"/\"default-dockercfg-qgplg\"" Apr 20 15:38:47.887842 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.887786 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr"] Apr 20 15:38:47.939559 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.939535 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-8p7wk_eb88787e-8848-4d3f-bcdd-871260569c2c/networking-console-plugin/0.log" Apr 20 15:38:47.991717 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.991685 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-sys\") pod \"perf-node-gather-daemonset-769nr\" (UID: 
\"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:47.991838 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.991727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-podres\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:47.991838 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.991759 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-lib-modules\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:47.991944 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.991849 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-proc\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:47.991944 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:47.991880 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9zbg\" (UniqueName: \"kubernetes.io/projected/a47fdaad-7e30-4967-a49d-b058fa9044bf-kube-api-access-q9zbg\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.092241 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.092211 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-proc\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.092241 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.092245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9zbg\" (UniqueName: \"kubernetes.io/projected/a47fdaad-7e30-4967-a49d-b058fa9044bf-kube-api-access-q9zbg\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.092413 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.092270 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-sys\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.092413 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.092302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-podres\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.092413 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.092324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-proc\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " 
pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.092413 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.092337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-lib-modules\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.092413 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.092400 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-sys\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.092609 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.092423 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-podres\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.092609 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.092513 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a47fdaad-7e30-4967-a49d-b058fa9044bf-lib-modules\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.099583 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.099565 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9zbg\" (UniqueName: 
\"kubernetes.io/projected/a47fdaad-7e30-4967-a49d-b058fa9044bf-kube-api-access-q9zbg\") pod \"perf-node-gather-daemonset-769nr\" (UID: \"a47fdaad-7e30-4967-a49d-b058fa9044bf\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.192766 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.192683 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:48.323341 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.323315 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr"] Apr 20 15:38:48.325188 ip-10-0-134-230 kubenswrapper[2572]: W0420 15:38:48.325165 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda47fdaad_7e30_4967_a49d_b058fa9044bf.slice/crio-6a2efcd1fa53ab29cda691e99d842eea1f7223f7fe3ab3ee9df20e02c4bbba3f WatchSource:0}: Error finding container 6a2efcd1fa53ab29cda691e99d842eea1f7223f7fe3ab3ee9df20e02c4bbba3f: Status 404 returned error can't find the container with id 6a2efcd1fa53ab29cda691e99d842eea1f7223f7fe3ab3ee9df20e02c4bbba3f Apr 20 15:38:48.327159 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:48.327145 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:38:49.083959 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:49.083929 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" event={"ID":"a47fdaad-7e30-4967-a49d-b058fa9044bf","Type":"ContainerStarted","Data":"eb36d93693241c17b52ed496d0ead4a0aadcaf630d2f07372fdcc55db19505cc"} Apr 20 15:38:49.083959 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:49.083965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" 
event={"ID":"a47fdaad-7e30-4967-a49d-b058fa9044bf","Type":"ContainerStarted","Data":"6a2efcd1fa53ab29cda691e99d842eea1f7223f7fe3ab3ee9df20e02c4bbba3f"} Apr 20 15:38:49.084387 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:49.083988 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" Apr 20 15:38:49.102671 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:49.102625 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr" podStartSLOduration=2.102611832 podStartE2EDuration="2.102611832s" podCreationTimestamp="2026-04-20 15:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:38:49.100102898 +0000 UTC m=+2189.230978228" watchObservedRunningTime="2026-04-20 15:38:49.102611832 +0000 UTC m=+2189.233487136" Apr 20 15:38:50.239806 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:50.239776 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p98mt_20db58c7-db2e-4b2b-be5a-cf2278346010/dns/0.log" Apr 20 15:38:50.259382 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:50.259351 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p98mt_20db58c7-db2e-4b2b-be5a-cf2278346010/kube-rbac-proxy/0.log" Apr 20 15:38:50.391218 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:50.391188 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p5xng_034b6f21-f85a-4440-a061-d39a6df5dde4/dns-node-resolver/0.log" Apr 20 15:38:50.901983 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:50.901952 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-x8tdg_22757559-941f-4d9e-9128-3aeefc6665f3/node-ca/0.log" Apr 20 15:38:51.821407 ip-10-0-134-230 kubenswrapper[2572]: I0420 
15:38:51.821378 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-fgkrd_7159e679-12b5-4114-8f79-9e8ea4c77bdc/discovery/0.log"
Apr 20 15:38:51.840352 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:51.840323 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7fcf5d587f-fnkd4_a507084f-a532-4ee0-b5ae-4bbde0f9b484/kube-auth-proxy/0.log"
Apr 20 15:38:51.917875 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:51.917848 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-q76sp_6f8428f5-731d-47f5-a2a3-6e064ad22824/istio-proxy/0.log"
Apr 20 15:38:52.448875 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:52.448847 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fdmd9_8337a855-24f1-476f-b9e0-49701fd9bda2/serve-healthcheck-canary/0.log"
Apr 20 15:38:53.008066 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:53.008040 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7qmvr_ed7ca0c9-d248-41fb-a871-7313c1a4e1eb/kube-rbac-proxy/0.log"
Apr 20 15:38:53.030114 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:53.030091 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7qmvr_ed7ca0c9-d248-41fb-a871-7313c1a4e1eb/exporter/0.log"
Apr 20 15:38:53.050369 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:53.050347 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7qmvr_ed7ca0c9-d248-41fb-a871-7313c1a4e1eb/extractor/0.log"
Apr 20 15:38:54.905332 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:54.905299 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-2r77g_0653a16c-21ee-4062-b961-87c17974c316/manager/0.log"
Apr 20 15:38:55.049779 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:55.049734 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-ws2wq_ba566142-9484-403a-b32d-f2a048b6021b/manager/1.log"
Apr 20 15:38:55.061696 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:55.061671 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-ws2wq_ba566142-9484-403a-b32d-f2a048b6021b/manager/2.log"
Apr 20 15:38:55.097574 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:55.097550 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-769nr"
Apr 20 15:38:55.149499 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:55.149460 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cdf6786d9-jz9pb_6be64f0b-1aba-438c-a1bd-ae1c81cd4a97/manager/0.log"
Apr 20 15:38:55.166934 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:55.166848 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-gqcfc_fbb2fbe2-9ebf-4e0b-ac7e-57db6538eafc/postgres/0.log"
Apr 20 15:38:56.381245 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:38:56.381215 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-qkldg_c2a3d95d-2318-4e9e-b741-a5263163d18e/openshift-lws-operator/0.log"
Apr 20 15:39:02.296920 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:02.296895 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4jvg_1f117f6f-4e31-4bc5-91d0-9a6176af628e/kube-multus-additional-cni-plugins/0.log"
Apr 20 15:39:02.319888 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:02.319861 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4jvg_1f117f6f-4e31-4bc5-91d0-9a6176af628e/egress-router-binary-copy/0.log"
Apr 20 15:39:02.345310 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:02.345280 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4jvg_1f117f6f-4e31-4bc5-91d0-9a6176af628e/cni-plugins/0.log"
Apr 20 15:39:02.366981 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:02.366956 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4jvg_1f117f6f-4e31-4bc5-91d0-9a6176af628e/bond-cni-plugin/0.log"
Apr 20 15:39:02.390313 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:02.390289 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4jvg_1f117f6f-4e31-4bc5-91d0-9a6176af628e/routeoverride-cni/0.log"
Apr 20 15:39:02.413870 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:02.413846 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4jvg_1f117f6f-4e31-4bc5-91d0-9a6176af628e/whereabouts-cni-bincopy/0.log"
Apr 20 15:39:02.438120 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:02.438096 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4jvg_1f117f6f-4e31-4bc5-91d0-9a6176af628e/whereabouts-cni/0.log"
Apr 20 15:39:02.480569 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:02.480539 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b568r_de3eeafd-2ce5-4f51-9232-ac55f91bb7af/kube-multus/0.log"
Apr 20 15:39:02.710852 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:02.710774 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sq52t_7215f3fe-093a-42b9-bea0-26a93cb4e1ff/network-metrics-daemon/0.log"
Apr 20 15:39:02.732932 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:02.732910 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sq52t_7215f3fe-093a-42b9-bea0-26a93cb4e1ff/kube-rbac-proxy/0.log"
Apr 20 15:39:03.659180 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:03.659080 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhq74_772b645d-af27-49ac-9efa-a2cf5ea2725a/ovn-controller/0.log"
Apr 20 15:39:03.691878 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:03.691845 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhq74_772b645d-af27-49ac-9efa-a2cf5ea2725a/ovn-acl-logging/0.log"
Apr 20 15:39:03.716667 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:03.716645 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhq74_772b645d-af27-49ac-9efa-a2cf5ea2725a/kube-rbac-proxy-node/0.log"
Apr 20 15:39:03.739208 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:03.739188 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhq74_772b645d-af27-49ac-9efa-a2cf5ea2725a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 15:39:03.758709 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:03.758689 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhq74_772b645d-af27-49ac-9efa-a2cf5ea2725a/northd/0.log"
Apr 20 15:39:03.784718 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:03.784694 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhq74_772b645d-af27-49ac-9efa-a2cf5ea2725a/nbdb/0.log"
Apr 20 15:39:03.820847 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:03.820807 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhq74_772b645d-af27-49ac-9efa-a2cf5ea2725a/sbdb/0.log"
Apr 20 15:39:03.928925 ip-10-0-134-230 kubenswrapper[2572]: I0420 15:39:03.928850 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhq74_772b645d-af27-49ac-9efa-a2cf5ea2725a/ovnkube-controller/0.log"