Apr 16 20:11:40.213260 ip-10-0-131-31 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:11:40.630841 ip-10-0-131-31 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:40.630841 ip-10-0-131-31 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:11:40.630841 ip-10-0-131-31 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:40.630841 ip-10-0-131-31 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:11:40.630841 ip-10-0-131-31 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:40.632348 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.632228 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:11:40.635284 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635268 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:40.635284 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635284 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635288 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635291 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635294 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635297 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635299 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635303 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635305 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635308 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635311 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635314 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635317 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635320 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635323 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635325 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635328 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635330 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635333 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635336 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635338 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:40.635350 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635341 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635344 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635353 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635357 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635360 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635368 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635372 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635375 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635378 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635381 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635384 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635387 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635390 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635393 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635396 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635398 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635401 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635403 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635406 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635409 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:40.635865 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635411 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635414 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635416 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635419 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635421 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635424 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635426 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635429 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635432 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635434 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635437 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635439 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635442 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635444 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635447 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635451 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635453 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635456 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635459 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635461 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:40.636403 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635464 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635466 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635469 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635471 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635474 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635476 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635479 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635481 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635484 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635486 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635489 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635491 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635494 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635497 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635499 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635502 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635504 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635509 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635513 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:40.636898 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635515 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635518 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635520 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635523 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635525 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635528 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635882 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635888 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635891 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635896 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635900 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635903 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635906 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635909 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635911 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635914 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635916 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635919 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635921 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:40.637400 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635924 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635927 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635929 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635932 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635934 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635937 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635939 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635942 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635959 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635962 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635964 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635967 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635970 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635972 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635975 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635979 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635982 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635985 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635988 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:40.637858 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635990 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635993 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635996 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.635999 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636002 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636005 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636007 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636010 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636012 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636015 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636018 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636020 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636023 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636025 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636027 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636030 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636032 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636035 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636037 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636041 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:40.638377 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636044 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636047 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636049 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636055 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636058 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636061 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636064 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636066 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636069 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636072 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636074 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636077 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636079 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636082 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636085 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636087 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636090 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636093 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636095 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636098 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:40.638868 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636100 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636103 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636105 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636108 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636110 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636113 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636115 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636118 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636120 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636124 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636128 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636130 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636133 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636135 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636211 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636222 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636228 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636233 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636237 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636240 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636245 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:11:40.639379 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636249 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636253 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636256 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636259 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636262 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636266 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636269 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636271 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636274 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636277 2577 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636280 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636283 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636288 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636291 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636294 2577 flags.go:64] FLAG: --config-dir=""
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636297 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636300 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636304 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636307 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636310 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636313 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636317 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636320 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636323 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636326 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:11:40.639886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636329 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636333 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636336 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636344 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636347 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636350 2577 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636353 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636357 2577 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636360 2577 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636363 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636365 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636369 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636372 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636375 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636378 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636381 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636384 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636387 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636390 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636393 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636396 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636399 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636402 2577 flags.go:64] FLAG: --feature-gates=""
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636411 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636414 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 20:11:40.640514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636418 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636421 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636424 2577 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636427 2577 flags.go:64] FLAG: --help="false"
Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:11:40.636430 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-131-31.ec2.internal" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636433 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636436 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636439 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636442 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636446 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636451 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636454 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636457 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636459 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636462 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636465 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636468 2577 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636471 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 
20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636474 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636477 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636480 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636483 2577 flags.go:64] FLAG: --lock-file="" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636486 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636489 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:11:40.641174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636492 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636497 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636500 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636503 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636506 2577 flags.go:64] FLAG: --logging-format="text" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636509 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636512 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636515 2577 flags.go:64] FLAG: --manifest-url="" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636518 2577 
flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636522 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636525 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636530 2577 flags.go:64] FLAG: --max-pods="110" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636532 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636535 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636538 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636541 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636544 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636547 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636552 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636560 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636563 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636566 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:11:40.641791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636569 2577 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:11:40.641791 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636572 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636577 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636580 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636583 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636586 2577 flags.go:64] FLAG: --port="10250" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636590 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636593 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-025a7353b9ad2db4f" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636596 2577 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636599 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636602 2577 flags.go:64] FLAG: --register-node="true" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636605 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636608 2577 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636612 2577 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636615 2577 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:11:40.636618 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636620 2577 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636624 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636627 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636630 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636633 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636636 2577 flags.go:64] FLAG: --runonce="false" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636639 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636642 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636645 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636648 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636650 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:11:40.642380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636653 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636656 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636661 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 
20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636664 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636667 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636670 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636673 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636676 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636678 2577 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636681 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636688 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636691 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636694 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636699 2577 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636702 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636705 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636708 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636711 2577 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636714 2577 flags.go:64] FLAG: --v="2" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636719 2577 flags.go:64] FLAG: --version="false" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636723 2577 flags.go:64] FLAG: --vmodule="" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636727 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.636731 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636834 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636838 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:40.643014 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636841 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636844 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636847 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636849 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636852 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636855 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:40.643639 
ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636858 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636861 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636863 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636868 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636871 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636874 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636876 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636879 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636882 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636885 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636888 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636892 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636894 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity 
Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636897 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:40.643639 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636900 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636903 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636905 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636908 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636911 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636915 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636919 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636921 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636924 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636926 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636929 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636931 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636934 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636936 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636939 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636941 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636959 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636962 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636965 2577 feature_gate.go:328] unrecognized 
feature gate: ExternalOIDC Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636967 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:40.644205 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636970 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636975 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636978 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636981 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636984 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636986 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636989 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636992 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.636994 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637001 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637003 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:40.644691 ip-10-0-131-31 
kubenswrapper[2577]: W0416 20:11:40.637006 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637009 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637012 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637015 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637017 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637020 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637022 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637025 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:40.644691 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637027 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637030 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637033 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637035 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637038 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:40.645288 
ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637041 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637044 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637046 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637049 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637052 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637056 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637059 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637062 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637065 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637069 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637072 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637075 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637084 2577 
feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637086 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637089 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:40.645288 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637092 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637094 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637098 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637101 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.637104 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.637743 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.644389 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 20:11:40.645786 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:11:40.644403 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644459 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644464 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644467 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644470 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644473 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644476 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644479 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:40.645786 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644482 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644485 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644488 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644490 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644493 2577 feature_gate.go:328] unrecognized feature 
gate: ClusterVersionOperatorConfiguration Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644496 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644498 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644501 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644503 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644506 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644509 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644512 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644514 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644517 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644520 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644522 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644525 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: 
W0416 20:11:40.644528 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644530 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644533 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:40.646181 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644535 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644538 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644541 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644543 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644547 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644549 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644552 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644555 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644557 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644560 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 
20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644562 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644565 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644568 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644572 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644574 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644578 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644581 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644584 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644587 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:40.646661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644589 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644592 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644595 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 
20:11:40.644597 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644600 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644602 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644605 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644607 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644610 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644613 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644616 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644618 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644621 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644624 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644626 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644629 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: 
W0416 20:11:40.644633 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644638 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644641 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644644 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:40.647145 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644647 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644650 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644652 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644655 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644658 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644660 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644663 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644666 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644668 2577 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAzure Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644671 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644674 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644676 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644679 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644681 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644684 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644686 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644689 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644691 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644694 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:40.647673 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644697 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.644703 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false 
MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644807 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644812 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644815 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644818 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644821 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644824 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644826 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644829 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644832 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644835 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644838 2577 
feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644840 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644843 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644845 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:40.648155 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644848 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644850 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644853 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644856 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644859 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644861 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644864 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644867 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644869 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:40.648553 
ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644872 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644874 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644877 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644879 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644881 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644884 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644886 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644889 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644891 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644894 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:40.648553 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644897 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644900 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644902 2577 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644905 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644907 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644909 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644912 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644915 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644918 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644920 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644923 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644926 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644929 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644931 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644934 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:40.649158 ip-10-0-131-31 
kubenswrapper[2577]: W0416 20:11:40.644936 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644939 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644942 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644959 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644962 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:40.649158 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644966 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644970 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644972 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644975 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644978 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644980 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644983 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644985 2577 
feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644988 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644991 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644993 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644996 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.644998 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645001 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645004 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645007 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645009 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645012 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645015 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645017 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:40.649661 ip-10-0-131-31 kubenswrapper[2577]: W0416 
20:11:40.645021 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645024 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645026 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645029 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645031 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645034 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645036 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645039 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645041 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645045 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645049 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645052 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:40.645054 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.645060 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.645802 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 20:11:40.650176 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.648769 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 20:11:40.650568 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.649673 2577 server.go:1019] "Starting client certificate rotation" Apr 16 20:11:40.650568 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.649777 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:11:40.650568 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.649824 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 
20:11:40.673180 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.673165 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:11:40.677996 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.677979 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:11:40.690677 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.690656 2577 log.go:25] "Validated CRI v1 runtime API" Apr 16 20:11:40.696887 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.696874 2577 log.go:25] "Validated CRI v1 image API" Apr 16 20:11:40.698301 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.698285 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 20:11:40.701126 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.701108 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:11:40.701802 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.701778 2577 fs.go:135] Filesystem UUIDs: map[36fc33e0-9d3b-4ae8-a772-532542294828:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f7c174c4-21db-49c4-b785-06432dc92a08:/dev/nvme0n1p3] Apr 16 20:11:40.701850 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.701801 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 20:11:40.707606 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.707488 
2577 manager.go:217] Machine: {Timestamp:2026-04-16 20:11:40.705407916 +0000 UTC m=+0.381676143 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099416 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec215e076a01ea03797908c137dd4a33 SystemUUID:ec215e07-6a01-ea03-7979-08c137dd4a33 BootID:4c17a861-e6b1-47ef-b316-35dd2274946f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1f:83:4c:93:c9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1f:83:4c:93:c9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:10:d0:3c:62:86 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 20:11:40.707606 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.707602 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 20:11:40.707720 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.707677 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 20:11:40.708720 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.708697 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 20:11:40.708930 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.708723 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-31.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 20:11:40.709042 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.708943 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 20:11:40.709042 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.708975 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 20:11:40.709042 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.708994 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:40.709629 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.709617 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:40.711250 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.711237 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:40.711378 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.711368 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 20:11:40.713642 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.713621 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 20:11:40.713719 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.713654 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 20:11:40.713719 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.713670 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 20:11:40.713719 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.713683 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 16 20:11:40.713719 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.713696 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 20:11:40.714815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.714801 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:11:40.714894 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.714825 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:11:40.717701 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.717679 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 20:11:40.719307 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.719294 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 20:11:40.720937 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.720924 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 20:11:40.720999 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.720941 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 20:11:40.720999 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.720959 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 20:11:40.720999 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.720967 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 20:11:40.720999 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.720976 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 20:11:40.720999 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.720986 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 20:11:40.720999 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.720994 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 20:11:40.721153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.721012 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 20:11:40.721153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.721020 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 20:11:40.721153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.721026 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 20:11:40.721153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.721042 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 20:11:40.721153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.721050 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 20:11:40.721892 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.721881 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 20:11:40.721929 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.721895 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 20:11:40.725161 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.725141 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dvsl5"
Apr 16 20:11:40.725929 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.725798 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 20:11:40.726386 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.726368 2577 server.go:1295] "Started kubelet"
Apr 16 20:11:40.727374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.727327 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-31.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 20:11:40.727487 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.727422 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 20:11:40.727506 ip-10-0-131-31 systemd[1]: Started Kubernetes Kubelet.
Apr 16 20:11:40.728183 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.727915 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 20:11:40.728183 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.728151 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 20:11:40.728288 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.728158 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-31.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 20:11:40.728288 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.728235 2577 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 20:11:40.729246 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.729230 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 20:11:40.729623 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.729607 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 20:11:40.732884 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.732857 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dvsl5"
Apr 16 20:11:40.733823 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.733111 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-31.ec2.internal.18a6ef607c53d392 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-31.ec2.internal,UID:ip-10-0-131-31.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-31.ec2.internal,},FirstTimestamp:2026-04-16 20:11:40.726006674 +0000 UTC m=+0.402274901,LastTimestamp:2026-04-16 20:11:40.726006674 +0000 UTC m=+0.402274901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-31.ec2.internal,}"
Apr 16 20:11:40.736726 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.736702 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 20:11:40.738417 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.738402 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 20:11:40.738417 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.738411 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:40.739040 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.738966 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 20:11:40.739040 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.738978 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 20:11:40.739040 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.738992 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 20:11:40.739236 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.739107 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 20:11:40.739236 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.739116 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 20:11:40.739236 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.739202 2577 factory.go:55] Registering systemd factory
Apr 16 20:11:40.739377 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.739264 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 16 20:11:40.739377 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.739265 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:40.739468 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.739462 2577 factory.go:153] Registering CRI-O factory
Apr 16 20:11:40.739517 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.739473 2577 factory.go:223] Registration of the crio container factory successfully
Apr 16 20:11:40.739517 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.739514 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 20:11:40.739602 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.739537 2577 factory.go:103] Registering Raw factory
Apr 16 20:11:40.739602 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.739578 2577 manager.go:1196] Started watching for new ooms in manager
Apr 16 20:11:40.740232 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.740218 2577 manager.go:319] Starting recovery of all containers
Apr 16 20:11:40.751079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.751064 2577 manager.go:324] Recovery completed
Apr 16 20:11:40.752237 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.752221 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:40.755616 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.755602 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:40.755687 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.755674 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-31.ec2.internal\" not found" node="ip-10-0-131-31.ec2.internal"
Apr 16 20:11:40.757864 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.757848 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:40.757916 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.757879 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:40.757916 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.757893 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:40.758349 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.758337 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 20:11:40.758388 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.758366 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 20:11:40.758388 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.758382 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:40.761029 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.761019 2577 policy_none.go:49] "None policy: Start"
Apr 16 20:11:40.761071 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.761034 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 20:11:40.761071 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.761044 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 20:11:40.807401 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.795165 2577 manager.go:341] "Starting Device Plugin manager"
Apr 16 20:11:40.807401 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.795319 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 20:11:40.807401 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.795333 2577 server.go:85] "Starting device plugin registration server"
Apr 16 20:11:40.807401 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.795555 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 20:11:40.807401 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.795570 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 20:11:40.807401 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.795670 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 20:11:40.807401 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.795785 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 20:11:40.807401 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.795796 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 20:11:40.807401 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.796174 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 20:11:40.807401 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.796208 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:40.869021 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.868993 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 20:11:40.870154 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.870138 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 20:11:40.870209 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.870167 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 20:11:40.870209 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.870186 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 20:11:40.870209 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.870197 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 20:11:40.870306 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.870229 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 20:11:40.873841 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.873826 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:40.896144 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.896110 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:40.896899 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.896882 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:40.896989 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.896913 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:40.896989 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.896926 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:40.896989 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.896963 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-31.ec2.internal"
Apr 16 20:11:40.905704 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.905690 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-31.ec2.internal"
Apr 16 20:11:40.905748 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.905711 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-31.ec2.internal\": node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:40.924777 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:40.924763 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:40.970987 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.970967 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-31.ec2.internal"]
Apr 16 20:11:40.971057 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.971034 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:40.972288 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.972265 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:40.972336 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.972294 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:40.972336 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.972304 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:40.973551 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.973540 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:40.973701 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.973688 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:40.973736 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.973720 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:40.974169 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.974155 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:40.974239 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.974183 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:40.974239 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.974194 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:40.974239 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.974157 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:40.974351 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.974250 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:40.974351 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.974263 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:40.975582 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.975567 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:40.975629 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.975598 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:40.976201 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.976186 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:40.976262 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.976210 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:40.976262 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:40.976219 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:41.000462 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.000445 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-31.ec2.internal\" not found" node="ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.004668 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.004653 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-31.ec2.internal\" not found" node="ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.025010 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.024981 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:41.040834 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.040813 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4af69d7c0e5c5b8fbd782e214cad2fa5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal\" (UID: \"4af69d7c0e5c5b8fbd782e214cad2fa5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.040901 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.040838 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4af69d7c0e5c5b8fbd782e214cad2fa5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal\" (UID: \"4af69d7c0e5c5b8fbd782e214cad2fa5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.040901 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.040855 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/83cea7cfd9a6bfdb650e2cb8e7b4bc3c-config\") pod \"kube-apiserver-proxy-ip-10-0-131-31.ec2.internal\" (UID: \"83cea7cfd9a6bfdb650e2cb8e7b4bc3c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.125648 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.125626 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:41.141193 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.141174 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4af69d7c0e5c5b8fbd782e214cad2fa5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal\" (UID: \"4af69d7c0e5c5b8fbd782e214cad2fa5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.141246 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.141199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4af69d7c0e5c5b8fbd782e214cad2fa5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal\" (UID: \"4af69d7c0e5c5b8fbd782e214cad2fa5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.141246 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.141224 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/83cea7cfd9a6bfdb650e2cb8e7b4bc3c-config\") pod \"kube-apiserver-proxy-ip-10-0-131-31.ec2.internal\" (UID: \"83cea7cfd9a6bfdb650e2cb8e7b4bc3c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.141315 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.141275 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4af69d7c0e5c5b8fbd782e214cad2fa5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal\" (UID: \"4af69d7c0e5c5b8fbd782e214cad2fa5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.141315 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.141281 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4af69d7c0e5c5b8fbd782e214cad2fa5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal\" (UID: \"4af69d7c0e5c5b8fbd782e214cad2fa5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.141377 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.141321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/83cea7cfd9a6bfdb650e2cb8e7b4bc3c-config\") pod \"kube-apiserver-proxy-ip-10-0-131-31.ec2.internal\" (UID: \"83cea7cfd9a6bfdb650e2cb8e7b4bc3c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.225962 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.225894 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:41.303355 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.303327 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.308160 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.308143 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:41.326107 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.326085 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:41.426532 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.426503 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:41.527051 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.526982 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:41.627554 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.627527 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:41.650002 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.649971 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 20:11:41.650374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.650120 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:41.650374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.650158 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:41.718296 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.718275 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:41.727613 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.727588 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:41.735268 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.735232 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:06:40 +0000 UTC" deadline="2027-11-07 10:52:46.700842234 +0000 UTC"
Apr 16 20:11:41.735351 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.735268 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13670h41m4.96557811s"
Apr 16 20:11:41.739027 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.739009 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:41.754212 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.754194 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:11:41.783608 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.783553 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wsm4z"
Apr 16 20:11:41.792466 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.792444 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wsm4z"
Apr 16 20:11:41.810330 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:41.810307 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4af69d7c0e5c5b8fbd782e214cad2fa5.slice/crio-3c8f0dec8a9b6a997e75bebda8c949cf0d165e55a41bdc5ca0fc72cc8a9253f0 WatchSource:0}: Error finding container 3c8f0dec8a9b6a997e75bebda8c949cf0d165e55a41bdc5ca0fc72cc8a9253f0: Status 404 returned error can't find the container with id 3c8f0dec8a9b6a997e75bebda8c949cf0d165e55a41bdc5ca0fc72cc8a9253f0
Apr 16 20:11:41.810486 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:41.810472 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83cea7cfd9a6bfdb650e2cb8e7b4bc3c.slice/crio-92014b26b06cb1e69eee51b18971bcbdbcd8dd8ba083ca4e91cd93708bf22eef WatchSource:0}: Error finding container 92014b26b06cb1e69eee51b18971bcbdbcd8dd8ba083ca4e91cd93708bf22eef: Status 404 returned error can't find the container with id 92014b26b06cb1e69eee51b18971bcbdbcd8dd8ba083ca4e91cd93708bf22eef
Apr 16 20:11:41.814416 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.814404 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:11:41.828476 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.828459 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:41.872860 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.872817 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-31.ec2.internal" event={"ID":"83cea7cfd9a6bfdb650e2cb8e7b4bc3c","Type":"ContainerStarted","Data":"92014b26b06cb1e69eee51b18971bcbdbcd8dd8ba083ca4e91cd93708bf22eef"}
Apr 16 20:11:41.873615 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:41.873596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal" event={"ID":"4af69d7c0e5c5b8fbd782e214cad2fa5","Type":"ContainerStarted","Data":"3c8f0dec8a9b6a997e75bebda8c949cf0d165e55a41bdc5ca0fc72cc8a9253f0"}
Apr 16 20:11:41.928773 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:41.928751 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:42.029353 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:42.029325 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-31.ec2.internal\" not found"
Apr 16 20:11:42.082998 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.082975 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:42.138791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.138768 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:42.151197 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.151172 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:11:42.153311 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.152389 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-31.ec2.internal"
Apr 16 20:11:42.159096 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.159079 2577
warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:11:42.715362 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.715330 2577 apiserver.go:52] "Watching apiserver" Apr 16 20:11:42.722339 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.722319 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 20:11:42.724371 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.724345 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-t77hx","openshift-network-operator/iptables-alerter-jwtk7","kube-system/konnectivity-agent-9qvtx","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t","openshift-cluster-node-tuning-operator/tuned-rcrtv","openshift-dns/node-resolver-w8lw7","openshift-image-registry/node-ca-jwfkw","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal","openshift-multus/multus-8gsfm","openshift-ovn-kubernetes/ovnkube-node-985k4","kube-system/kube-apiserver-proxy-ip-10-0-131-31.ec2.internal","openshift-multus/multus-additional-cni-plugins-6gf96","openshift-multus/network-metrics-daemon-l9pv6"] Apr 16 20:11:42.726548 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.726527 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jwfkw" Apr 16 20:11:42.727911 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.727890 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-jwtk7" Apr 16 20:11:42.729154 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.729133 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:11:42.730173 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.730154 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-sz2sq\"" Apr 16 20:11:42.731121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.730731 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:42.731121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.730819 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.731121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.730991 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.731486 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.731224 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:42.731603 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.731587 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:11:42.731671 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.731598 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:11:42.731731 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.731686 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 20:11:42.731808 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.731787 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vv7fc\"" Apr 16 20:11:42.732675 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.732654 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.732770 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.732682 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-cr48n\"" Apr 16 20:11:42.732770 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.732706 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 20:11:42.732942 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.732919 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 20:11:42.734024 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.734001 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 20:11:42.734215 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.734198 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wpxkn\"" Apr 16 20:11:42.734621 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.734590 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:42.734751 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.734697 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:42.735340 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.735311 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 20:11:42.735419 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.735342 2577 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 20:11:42.735662 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.735646 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 20:11:42.735867 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.735799 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:11:42.735935 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:42.735919 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283" Apr 16 20:11:42.736188 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.736102 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-bm2x5\"" Apr 16 20:11:42.736188 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.736163 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 20:11:42.736311 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.736165 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 20:11:42.736809 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.736782 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 20:11:42.738064 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.738043 2577 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:11:42.740389 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.740369 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w8lw7" Apr 16 20:11:42.741157 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.740972 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:11:42.741157 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.740980 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:11:42.741317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.741182 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-h7ffq\"" Apr 16 20:11:42.742745 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.742696 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 20:11:42.742934 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.742916 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:11:42.743075 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:42.743049 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648" Apr 16 20:11:42.743445 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.743428 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 20:11:42.743779 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.743762 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pvnl8\"" Apr 16 20:11:42.746232 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.746212 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.748259 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.748239 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.748689 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.748671 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 20:11:42.748791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.748773 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 20:11:42.749051 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.749023 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 20:11:42.749136 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.749124 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 20:11:42.749197 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.749173 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-phqrk\"" Apr 16 20:11:42.750726 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.750704 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-run-ovn\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.750830 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.750738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-systemd-units\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.750830 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.750761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-run-ovn-kubernetes\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.750830 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.750784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/909cb17e-abc8-410f-adb0-206e00624944-ovnkube-script-lib\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.750830 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.750808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56120f2f-13d6-48da-a3d5-a472edb09dfc-host\") pod \"node-ca-jwfkw\" (UID: 
\"56120f2f-13d6-48da-a3d5-a472edb09dfc\") " pod="openshift-image-registry/node-ca-jwfkw" Apr 16 20:11:42.751057 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.750835 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-modprobe-d\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.751057 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.750874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-run\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.751057 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.750898 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-slash\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.751057 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.750935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/99d4b836-b69f-4839-b4e1-937780e53711-konnectivity-ca\") pod \"konnectivity-agent-9qvtx\" (UID: \"99d4b836-b69f-4839-b4e1-937780e53711\") " pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:11:42.751057 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.750976 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/9c16c53d-2e96-42ef-86b8-344e57a267ee-tmp-dir\") pod \"node-resolver-w8lw7\" (UID: \"9c16c53d-2e96-42ef-86b8-344e57a267ee\") " pod="openshift-dns/node-resolver-w8lw7" Apr 16 20:11:42.751057 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.750999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-device-dir\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.751340 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-sys\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.751340 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-var-lib-openvswitch\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.751340 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751195 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-node-log\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.751340 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751221 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.751340 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751243 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-sysctl-conf\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.751340 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751265 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-cni-bin\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.751340 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751313 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/909cb17e-abc8-410f-adb0-206e00624944-ovnkube-config\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751354 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/909cb17e-abc8-410f-adb0-206e00624944-env-overrides\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751374 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751389 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-kubernetes\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-run-netns\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-etc-openvswitch\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751453 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751460 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751527 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-tl9k7\"" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctfcl\" (UniqueName: \"kubernetes.io/projected/2d8fda20-4941-47b7-ab67-e2b86e2a8523-kube-api-access-ctfcl\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751608 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-systemd\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxvv6\" (UniqueName: 
\"kubernetes.io/projected/22a56877-dc06-432a-9ca1-06a091e2af67-kube-api-access-vxvv6\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751660 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-run-openvswitch\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-sysconfig\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-host\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-kubelet\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751748 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/56120f2f-13d6-48da-a3d5-a472edb09dfc-serviceca\") pod \"node-ca-jwfkw\" (UID: \"56120f2f-13d6-48da-a3d5-a472edb09dfc\") " pod="openshift-image-registry/node-ca-jwfkw" Apr 16 20:11:42.751829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/99d4b836-b69f-4839-b4e1-937780e53711-agent-certs\") pod \"konnectivity-agent-9qvtx\" (UID: \"99d4b836-b69f-4839-b4e1-937780e53711\") " pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751791 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/22a56877-dc06-432a-9ca1-06a091e2af67-etc-tuned\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-log-socket\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/909cb17e-abc8-410f-adb0-206e00624944-ovn-node-metrics-cert\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.752673 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:11:42.751884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc7mx\" (UniqueName: \"kubernetes.io/projected/c7481f18-a2dc-4df2-8b59-4184bc5c7283-kube-api-access-dc7mx\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89z8w\" (UniqueName: \"kubernetes.io/projected/9c16c53d-2e96-42ef-86b8-344e57a267ee-kube-api-access-89z8w\") pod \"node-resolver-w8lw7\" (UID: \"9c16c53d-2e96-42ef-86b8-344e57a267ee\") " pod="openshift-dns/node-resolver-w8lw7" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-etc-selinux\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.751986 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c16c53d-2e96-42ef-86b8-344e57a267ee-hosts-file\") pod \"node-resolver-w8lw7\" (UID: \"9c16c53d-2e96-42ef-86b8-344e57a267ee\") " pod="openshift-dns/node-resolver-w8lw7" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752003 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-socket-dir\") pod 
\"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-cni-netd\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752062 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-sys-fs\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752084 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-sysctl-d\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752105 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-var-lib-kubelet\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752128 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qdsc\" (UniqueName: \"kubernetes.io/projected/909cb17e-abc8-410f-adb0-206e00624944-kube-api-access-4qdsc\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j28n\" (UniqueName: \"kubernetes.io/projected/512ea84f-ed8b-4a83-a907-ce38ff494ba1-kube-api-access-9j28n\") pod \"iptables-alerter-jwtk7\" (UID: \"512ea84f-ed8b-4a83-a907-ce38ff494ba1\") " pod="openshift-network-operator/iptables-alerter-jwtk7" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752224 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-registration-dir\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.752673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-run-systemd\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.753436 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752294 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/512ea84f-ed8b-4a83-a907-ce38ff494ba1-host-slash\") pod \"iptables-alerter-jwtk7\" (UID: \"512ea84f-ed8b-4a83-a907-ce38ff494ba1\") 
" pod="openshift-network-operator/iptables-alerter-jwtk7" Apr 16 20:11:42.753436 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8kl8\" (UniqueName: \"kubernetes.io/projected/56120f2f-13d6-48da-a3d5-a472edb09dfc-kube-api-access-n8kl8\") pod \"node-ca-jwfkw\" (UID: \"56120f2f-13d6-48da-a3d5-a472edb09dfc\") " pod="openshift-image-registry/node-ca-jwfkw" Apr 16 20:11:42.753436 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-lib-modules\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.753436 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752368 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22a56877-dc06-432a-9ca1-06a091e2af67-tmp\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.753436 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.752390 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/512ea84f-ed8b-4a83-a907-ce38ff494ba1-iptables-alerter-script\") pod \"iptables-alerter-jwtk7\" (UID: \"512ea84f-ed8b-4a83-a907-ce38ff494ba1\") " pod="openshift-network-operator/iptables-alerter-jwtk7" Apr 16 20:11:42.793490 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.793462 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:41 +0000 UTC" 
deadline="2027-10-06 23:31:07.306399325 +0000 UTC" Apr 16 20:11:42.793592 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.793484 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12915h19m24.512917197s" Apr 16 20:11:42.841055 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.841024 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 20:11:42.852925 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.852897 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-run-ovn\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.853079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.852957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-run-multus-certs\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.853079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.852985 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-systemd-units\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.853079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-run-ovn-kubernetes\") pod \"ovnkube-node-985k4\" (UID: 
\"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.853079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-run-ovn\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.853079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853060 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-systemd-units\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.853314 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/909cb17e-abc8-410f-adb0-206e00624944-ovnkube-script-lib\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.853314 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853144 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-run-ovn-kubernetes\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.853314 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-system-cni-dir\") pod 
\"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.853314 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56120f2f-13d6-48da-a3d5-a472edb09dfc-host\") pod \"node-ca-jwfkw\" (UID: \"56120f2f-13d6-48da-a3d5-a472edb09dfc\") " pod="openshift-image-registry/node-ca-jwfkw" Apr 16 20:11:42.853314 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-modprobe-d\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.853314 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853240 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56120f2f-13d6-48da-a3d5-a472edb09dfc-host\") pod \"node-ca-jwfkw\" (UID: \"56120f2f-13d6-48da-a3d5-a472edb09dfc\") " pod="openshift-image-registry/node-ca-jwfkw" Apr 16 20:11:42.853314 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-run\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.853640 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853320 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-run\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.853640 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-modprobe-d\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.853640 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853335 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-slash\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.853640 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/99d4b836-b69f-4839-b4e1-937780e53711-konnectivity-ca\") pod \"konnectivity-agent-9qvtx\" (UID: \"99d4b836-b69f-4839-b4e1-937780e53711\") " pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:11:42.853640 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853403 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9c16c53d-2e96-42ef-86b8-344e57a267ee-tmp-dir\") pod \"node-resolver-w8lw7\" (UID: \"9c16c53d-2e96-42ef-86b8-344e57a267ee\") " pod="openshift-dns/node-resolver-w8lw7" Apr 16 20:11:42.853640 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-slash\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.853640 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-var-lib-kubelet\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.853640 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853589 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvz8d\" (UniqueName: \"kubernetes.io/projected/bd1cbed9-748e-4677-b9ad-5f73500b5738-kube-api-access-fvz8d\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.853640 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-device-dir\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-sys\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-var-lib-openvswitch\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-node-log\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853721 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853747 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-run-netns\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9c16c53d-2e96-42ef-86b8-344e57a267ee-tmp-dir\") pod \"node-resolver-w8lw7\" (UID: \"9c16c53d-2e96-42ef-86b8-344e57a267ee\") " pod="openshift-dns/node-resolver-w8lw7" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853769 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-multus-conf-dir\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-sysctl-conf\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-cni-bin\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853831 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/909cb17e-abc8-410f-adb0-206e00624944-ovnkube-script-lib\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-var-lib-openvswitch\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/909cb17e-abc8-410f-adb0-206e00624944-ovnkube-config\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/99d4b836-b69f-4839-b4e1-937780e53711-konnectivity-ca\") pod \"konnectivity-agent-9qvtx\" (UID: \"99d4b836-b69f-4839-b4e1-937780e53711\") " pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853890 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/909cb17e-abc8-410f-adb0-206e00624944-env-overrides\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-system-cni-dir\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853966 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-sys\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.853997 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-cnibin\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.854028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854021 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-node-log\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854028 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-multus-socket-dir-parent\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854046 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-sysctl-conf\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-run-k8s-cni-cncf-io\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854067 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-kubernetes\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-run-netns\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854104 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-device-dir\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854138 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-etc-openvswitch\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854146 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-run-netns\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854177 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-cni-bin\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854190 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-kubernetes\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854192 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-etc-openvswitch\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854239 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-multus-cni-dir\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e372e398-739d-495f-a53d-0b5eea55bc76-cni-binary-copy\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:42.854300 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/909cb17e-abc8-410f-adb0-206e00624944-env-overrides\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.854815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/909cb17e-abc8-410f-adb0-206e00624944-ovnkube-config\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854352 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:42.854365 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs podName:c7481f18-a2dc-4df2-8b59-4184bc5c7283 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:43.354343914 +0000 UTC m=+3.030612155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs") pod "network-metrics-daemon-l9pv6" (UID: "c7481f18-a2dc-4df2-8b59-4184bc5c7283") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854400 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctfcl\" (UniqueName: \"kubernetes.io/projected/2d8fda20-4941-47b7-ab67-e2b86e2a8523-kube-api-access-ctfcl\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854436 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-systemd\") pod \"tuned-rcrtv\" (UID: 
\"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854458 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxvv6\" (UniqueName: \"kubernetes.io/projected/22a56877-dc06-432a-9ca1-06a091e2af67-kube-api-access-vxvv6\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-run-openvswitch\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854537 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-var-lib-cni-multus\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854542 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-run-openvswitch\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854503 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-systemd\") 
pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-sysconfig\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-host\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-kubelet\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854628 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-sysconfig\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854640 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/bd1cbed9-748e-4677-b9ad-5f73500b5738-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.855613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-host\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/56120f2f-13d6-48da-a3d5-a472edb09dfc-serviceca\") pod \"node-ca-jwfkw\" (UID: \"56120f2f-13d6-48da-a3d5-a472edb09dfc\") " pod="openshift-image-registry/node-ca-jwfkw" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854683 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-kubelet\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/99d4b836-b69f-4839-b4e1-937780e53711-agent-certs\") pod \"konnectivity-agent-9qvtx\" (UID: \"99d4b836-b69f-4839-b4e1-937780e53711\") " pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-os-release\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854746 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-os-release\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/22a56877-dc06-432a-9ca1-06a091e2af67-etc-tuned\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-log-socket\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/909cb17e-abc8-410f-adb0-206e00624944-ovn-node-metrics-cert\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dc7mx\" (UniqueName: \"kubernetes.io/projected/c7481f18-a2dc-4df2-8b59-4184bc5c7283-kube-api-access-dc7mx\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e372e398-739d-495f-a53d-0b5eea55bc76-multus-daemon-config\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7xx2\" (UniqueName: \"kubernetes.io/projected/e372e398-739d-495f-a53d-0b5eea55bc76-kube-api-access-j7xx2\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.854976 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-cnibin\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89z8w\" (UniqueName: \"kubernetes.io/projected/9c16c53d-2e96-42ef-86b8-344e57a267ee-kube-api-access-89z8w\") pod \"node-resolver-w8lw7\" (UID: \"9c16c53d-2e96-42ef-86b8-344e57a267ee\") " pod="openshift-dns/node-resolver-w8lw7" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:11:42.855047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-log-socket\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855073 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/56120f2f-13d6-48da-a3d5-a472edb09dfc-serviceca\") pod \"node-ca-jwfkw\" (UID: \"56120f2f-13d6-48da-a3d5-a472edb09dfc\") " pod="openshift-image-registry/node-ca-jwfkw" Apr 16 20:11:42.856368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd1cbed9-748e-4677-b9ad-5f73500b5738-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855285 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd1cbed9-748e-4677-b9ad-5f73500b5738-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855333 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-etc-selinux\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c16c53d-2e96-42ef-86b8-344e57a267ee-hosts-file\") pod \"node-resolver-w8lw7\" (UID: \"9c16c53d-2e96-42ef-86b8-344e57a267ee\") " pod="openshift-dns/node-resolver-w8lw7" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-etc-kubernetes\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-etc-selinux\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-socket-dir\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.857121 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:11:42.855463 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c16c53d-2e96-42ef-86b8-344e57a267ee-hosts-file\") pod \"node-resolver-w8lw7\" (UID: \"9c16c53d-2e96-42ef-86b8-344e57a267ee\") " pod="openshift-dns/node-resolver-w8lw7" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-cni-netd\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-sys-fs\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-sysctl-d\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-var-lib-kubelet\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.857121 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:11:42.855581 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-host-cni-netd\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qdsc\" (UniqueName: \"kubernetes.io/projected/909cb17e-abc8-410f-adb0-206e00624944-kube-api-access-4qdsc\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j28n\" (UniqueName: \"kubernetes.io/projected/512ea84f-ed8b-4a83-a907-ce38ff494ba1-kube-api-access-9j28n\") pod \"iptables-alerter-jwtk7\" (UID: \"512ea84f-ed8b-4a83-a907-ce38ff494ba1\") " pod="openshift-network-operator/iptables-alerter-jwtk7" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855636 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-sys-fs\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mm6t\" (UniqueName: \"kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t\") pod \"network-check-target-t77hx\" (UID: \"1280bc20-0a50-4b20-8b90-8057a3627648\") " 
pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:11:42.857121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855676 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855689 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-etc-sysctl-d\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-registration-dir\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-run-systemd\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855740 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-var-lib-kubelet\") 
pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855749 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/512ea84f-ed8b-4a83-a907-ce38ff494ba1-host-slash\") pod \"iptables-alerter-jwtk7\" (UID: \"512ea84f-ed8b-4a83-a907-ce38ff494ba1\") " pod="openshift-network-operator/iptables-alerter-jwtk7" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8kl8\" (UniqueName: \"kubernetes.io/projected/56120f2f-13d6-48da-a3d5-a472edb09dfc-kube-api-access-n8kl8\") pod \"node-ca-jwfkw\" (UID: \"56120f2f-13d6-48da-a3d5-a472edb09dfc\") " pod="openshift-image-registry/node-ca-jwfkw" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/512ea84f-ed8b-4a83-a907-ce38ff494ba1-host-slash\") pod \"iptables-alerter-jwtk7\" (UID: \"512ea84f-ed8b-4a83-a907-ce38ff494ba1\") " pod="openshift-network-operator/iptables-alerter-jwtk7" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-lib-modules\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855836 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22a56877-dc06-432a-9ca1-06a091e2af67-tmp\") pod 
\"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855851 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-registration-dir\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855860 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/512ea84f-ed8b-4a83-a907-ce38ff494ba1-iptables-alerter-script\") pod \"iptables-alerter-jwtk7\" (UID: \"512ea84f-ed8b-4a83-a907-ce38ff494ba1\") " pod="openshift-network-operator/iptables-alerter-jwtk7" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855866 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/909cb17e-abc8-410f-adb0-206e00624944-run-systemd\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855888 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-var-lib-cni-bin\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-hostroot\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.855585 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d8fda20-4941-47b7-ab67-e2b86e2a8523-socket-dir\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.856289 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/22a56877-dc06-432a-9ca1-06a091e2af67-lib-modules\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.858008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.857057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/512ea84f-ed8b-4a83-a907-ce38ff494ba1-iptables-alerter-script\") pod \"iptables-alerter-jwtk7\" (UID: \"512ea84f-ed8b-4a83-a907-ce38ff494ba1\") " pod="openshift-network-operator/iptables-alerter-jwtk7" Apr 16 20:11:42.859002 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.858867 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/22a56877-dc06-432a-9ca1-06a091e2af67-etc-tuned\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.859002 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.858941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/22a56877-dc06-432a-9ca1-06a091e2af67-tmp\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.859217 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.859192 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/909cb17e-abc8-410f-adb0-206e00624944-ovn-node-metrics-cert\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.859356 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.859246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/99d4b836-b69f-4839-b4e1-937780e53711-agent-certs\") pod \"konnectivity-agent-9qvtx\" (UID: \"99d4b836-b69f-4839-b4e1-937780e53711\") " pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:11:42.869565 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.869542 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctfcl\" (UniqueName: \"kubernetes.io/projected/2d8fda20-4941-47b7-ab67-e2b86e2a8523-kube-api-access-ctfcl\") pod \"aws-ebs-csi-driver-node-7tn9t\" (UID: \"2d8fda20-4941-47b7-ab67-e2b86e2a8523\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:42.871377 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.871354 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8kl8\" (UniqueName: \"kubernetes.io/projected/56120f2f-13d6-48da-a3d5-a472edb09dfc-kube-api-access-n8kl8\") pod \"node-ca-jwfkw\" (UID: \"56120f2f-13d6-48da-a3d5-a472edb09dfc\") " pod="openshift-image-registry/node-ca-jwfkw" Apr 16 20:11:42.874248 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.874188 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89z8w\" 
(UniqueName: \"kubernetes.io/projected/9c16c53d-2e96-42ef-86b8-344e57a267ee-kube-api-access-89z8w\") pod \"node-resolver-w8lw7\" (UID: \"9c16c53d-2e96-42ef-86b8-344e57a267ee\") " pod="openshift-dns/node-resolver-w8lw7" Apr 16 20:11:42.874353 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.874258 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j28n\" (UniqueName: \"kubernetes.io/projected/512ea84f-ed8b-4a83-a907-ce38ff494ba1-kube-api-access-9j28n\") pod \"iptables-alerter-jwtk7\" (UID: \"512ea84f-ed8b-4a83-a907-ce38ff494ba1\") " pod="openshift-network-operator/iptables-alerter-jwtk7" Apr 16 20:11:42.874353 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.874272 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxvv6\" (UniqueName: \"kubernetes.io/projected/22a56877-dc06-432a-9ca1-06a091e2af67-kube-api-access-vxvv6\") pod \"tuned-rcrtv\" (UID: \"22a56877-dc06-432a-9ca1-06a091e2af67\") " pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:42.874353 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.874304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc7mx\" (UniqueName: \"kubernetes.io/projected/c7481f18-a2dc-4df2-8b59-4184bc5c7283-kube-api-access-dc7mx\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:11:42.875495 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.875474 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qdsc\" (UniqueName: \"kubernetes.io/projected/909cb17e-abc8-410f-adb0-206e00624944-kube-api-access-4qdsc\") pod \"ovnkube-node-985k4\" (UID: \"909cb17e-abc8-410f-adb0-206e00624944\") " pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:42.957153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957123 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-var-lib-kubelet\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvz8d\" (UniqueName: \"kubernetes.io/projected/bd1cbed9-748e-4677-b9ad-5f73500b5738-kube-api-access-fvz8d\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.957317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-run-netns\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957205 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-var-lib-kubelet\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-multus-conf-dir\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957250 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-system-cni-dir\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957256 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-run-netns\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957289 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-cnibin\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957301 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-multus-conf-dir\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957315 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-multus-socket-dir-parent\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-run-k8s-cni-cncf-io\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957377 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-run-k8s-cni-cncf-io\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957350 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-system-cni-dir\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957391 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-multus-cni-dir\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-multus-socket-dir-parent\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957421 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/e372e398-739d-495f-a53d-0b5eea55bc76-cni-binary-copy\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-cnibin\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-multus-cni-dir\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-var-lib-cni-multus\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bd1cbed9-748e-4677-b9ad-5f73500b5738-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-os-release\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-var-lib-cni-multus\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-os-release\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e372e398-739d-495f-a53d-0b5eea55bc76-multus-daemon-config\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7xx2\" (UniqueName: \"kubernetes.io/projected/e372e398-739d-495f-a53d-0b5eea55bc76-kube-api-access-j7xx2\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957606 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-os-release\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957628 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-cnibin\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.957727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957691 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd1cbed9-748e-4677-b9ad-5f73500b5738-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd1cbed9-748e-4677-b9ad-5f73500b5738-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957740 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-etc-kubernetes\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957771 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2mm6t\" (UniqueName: \"kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t\") pod \"network-check-target-t77hx\" (UID: \"1280bc20-0a50-4b20-8b90-8057a3627648\") " pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957798 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-var-lib-cni-bin\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-hostroot\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-run-multus-certs\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.957974 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-system-cni-dir\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.958043 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e372e398-739d-495f-a53d-0b5eea55bc76-cni-binary-copy\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.958051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bd1cbed9-748e-4677-b9ad-5f73500b5738-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.958054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-system-cni-dir\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.958119 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-os-release\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.958391 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.958129 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e372e398-739d-495f-a53d-0b5eea55bc76-multus-daemon-config\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.958174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-hostroot\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.958176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.958210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-run-multus-certs\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.958391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.958224 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-host-var-lib-cni-bin\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.959266 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:11:42.958271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd1cbed9-748e-4677-b9ad-5f73500b5738-cnibin\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.959266 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.958303 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e372e398-739d-495f-a53d-0b5eea55bc76-etc-kubernetes\") pod \"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:42.959266 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.958740 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd1cbed9-748e-4677-b9ad-5f73500b5738-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.959372 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.959301 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd1cbed9-748e-4677-b9ad-5f73500b5738-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.966147 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:42.966095 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:42.966147 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:42.966117 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:42.966147 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:42.966132 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2mm6t for pod openshift-network-diagnostics/network-check-target-t77hx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:42.966351 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:42.966187 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t podName:1280bc20-0a50-4b20-8b90-8057a3627648 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:43.466174038 +0000 UTC m=+3.142442261 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2mm6t" (UniqueName: "kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t") pod "network-check-target-t77hx" (UID: "1280bc20-0a50-4b20-8b90-8057a3627648") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:42.966840 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.966823 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvz8d\" (UniqueName: \"kubernetes.io/projected/bd1cbed9-748e-4677-b9ad-5f73500b5738-kube-api-access-fvz8d\") pod \"multus-additional-cni-plugins-6gf96\" (UID: \"bd1cbed9-748e-4677-b9ad-5f73500b5738\") " pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:42.966940 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:42.966925 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7xx2\" (UniqueName: \"kubernetes.io/projected/e372e398-739d-495f-a53d-0b5eea55bc76-kube-api-access-j7xx2\") pod 
\"multus-8gsfm\" (UID: \"e372e398-739d-495f-a53d-0b5eea55bc76\") " pod="openshift-multus/multus-8gsfm" Apr 16 20:11:43.037737 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.037704 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:43.037921 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.037903 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jwfkw" Apr 16 20:11:43.049007 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.048984 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" Apr 16 20:11:43.058650 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.058630 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jwtk7" Apr 16 20:11:43.063203 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.063182 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" Apr 16 20:11:43.070833 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.070808 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:11:43.076425 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.076406 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:11:43.082995 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.082975 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w8lw7" Apr 16 20:11:43.089065 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.088550 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8gsfm" Apr 16 20:11:43.092419 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.092402 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6gf96" Apr 16 20:11:43.180248 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.180213 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:43.360268 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.360226 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:11:43.360441 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:43.360360 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:43.360441 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:43.360424 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs podName:c7481f18-a2dc-4df2-8b59-4184bc5c7283 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:44.360406618 +0000 UTC m=+4.036674835 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs") pod "network-metrics-daemon-l9pv6" (UID: "c7481f18-a2dc-4df2-8b59-4184bc5c7283") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:43.471250 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:43.471219 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c16c53d_2e96_42ef_86b8_344e57a267ee.slice/crio-320e5e335093d0120dbf1610e41083cbc17b548688b619c3fa7bb75b689e07b9 WatchSource:0}: Error finding container 320e5e335093d0120dbf1610e41083cbc17b548688b619c3fa7bb75b689e07b9: Status 404 returned error can't find the container with id 320e5e335093d0120dbf1610e41083cbc17b548688b619c3fa7bb75b689e07b9 Apr 16 20:11:43.472118 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:43.472093 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd1cbed9_748e_4677_b9ad_5f73500b5738.slice/crio-b64424961aa042ed96ef1029591396221addf2c8ecda14cfbb8388bbeeeb24ff WatchSource:0}: Error finding container b64424961aa042ed96ef1029591396221addf2c8ecda14cfbb8388bbeeeb24ff: Status 404 returned error can't find the container with id b64424961aa042ed96ef1029591396221addf2c8ecda14cfbb8388bbeeeb24ff Apr 16 20:11:43.473504 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:43.473431 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode372e398_739d_495f_a53d_0b5eea55bc76.slice/crio-9e50d5e1fa72eb039423c146eaa76703ad9675dab24af38812b2f2fdd17f0a34 WatchSource:0}: Error finding container 9e50d5e1fa72eb039423c146eaa76703ad9675dab24af38812b2f2fdd17f0a34: Status 404 returned error can't find the container with id 9e50d5e1fa72eb039423c146eaa76703ad9675dab24af38812b2f2fdd17f0a34 Apr 16 20:11:43.476781 
ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:43.476761 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22a56877_dc06_432a_9ca1_06a091e2af67.slice/crio-b0267cb8ab58e6a3d0a1400025ae37bd4f1a8024330c55ac64d24304ef4e9cec WatchSource:0}: Error finding container b0267cb8ab58e6a3d0a1400025ae37bd4f1a8024330c55ac64d24304ef4e9cec: Status 404 returned error can't find the container with id b0267cb8ab58e6a3d0a1400025ae37bd4f1a8024330c55ac64d24304ef4e9cec Apr 16 20:11:43.477773 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:43.477732 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8fda20_4941_47b7_ab67_e2b86e2a8523.slice/crio-84d423a58ce212350da89443947a9b5d0b16086fdb30dcc55e798c35a1c438b9 WatchSource:0}: Error finding container 84d423a58ce212350da89443947a9b5d0b16086fdb30dcc55e798c35a1c438b9: Status 404 returned error can't find the container with id 84d423a58ce212350da89443947a9b5d0b16086fdb30dcc55e798c35a1c438b9 Apr 16 20:11:43.478758 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:43.478733 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod512ea84f_ed8b_4a83_a907_ce38ff494ba1.slice/crio-bc4e4138199dcfe01c1fd16681c106ea4649519d80e05c1e6887a94fefd710a6 WatchSource:0}: Error finding container bc4e4138199dcfe01c1fd16681c106ea4649519d80e05c1e6887a94fefd710a6: Status 404 returned error can't find the container with id bc4e4138199dcfe01c1fd16681c106ea4649519d80e05c1e6887a94fefd710a6 Apr 16 20:11:43.479889 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:43.479864 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod909cb17e_abc8_410f_adb0_206e00624944.slice/crio-6be0f7580447b9256b1c6b6ccf3dcf27ac72a5d0b6cbbbf6b033a8739ad408d9 WatchSource:0}: Error 
finding container 6be0f7580447b9256b1c6b6ccf3dcf27ac72a5d0b6cbbbf6b033a8739ad408d9: Status 404 returned error can't find the container with id 6be0f7580447b9256b1c6b6ccf3dcf27ac72a5d0b6cbbbf6b033a8739ad408d9 Apr 16 20:11:43.480813 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:43.480791 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56120f2f_13d6_48da_a3d5_a472edb09dfc.slice/crio-542f91316ae3efa8e983b6359612b72148245e95c4e88cf1720a4f859dbdee64 WatchSource:0}: Error finding container 542f91316ae3efa8e983b6359612b72148245e95c4e88cf1720a4f859dbdee64: Status 404 returned error can't find the container with id 542f91316ae3efa8e983b6359612b72148245e95c4e88cf1720a4f859dbdee64 Apr 16 20:11:43.481569 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:11:43.481546 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99d4b836_b69f_4839_b4e1_937780e53711.slice/crio-bb3ff6c81afd6f3febd893591ec5e531287dca26b067056ce196425add68f743 WatchSource:0}: Error finding container bb3ff6c81afd6f3febd893591ec5e531287dca26b067056ce196425add68f743: Status 404 returned error can't find the container with id bb3ff6c81afd6f3febd893591ec5e531287dca26b067056ce196425add68f743 Apr 16 20:11:43.561434 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.561277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mm6t\" (UniqueName: \"kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t\") pod \"network-check-target-t77hx\" (UID: \"1280bc20-0a50-4b20-8b90-8057a3627648\") " pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:11:43.561540 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:43.561403 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 16 20:11:43.561596 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:43.561537 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:43.561596 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:43.561564 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2mm6t for pod openshift-network-diagnostics/network-check-target-t77hx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:43.561671 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:43.561604 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t podName:1280bc20-0a50-4b20-8b90-8057a3627648 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:44.561590994 +0000 UTC m=+4.237859209 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2mm6t" (UniqueName: "kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t") pod "network-check-target-t77hx" (UID: "1280bc20-0a50-4b20-8b90-8057a3627648") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:43.794784 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.794660 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:41 +0000 UTC" deadline="2027-11-15 11:29:36.04019712 +0000 UTC" Apr 16 20:11:43.794784 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.794701 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13863h17m52.24550074s" Apr 16 20:11:43.879373 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.879310 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gf96" event={"ID":"bd1cbed9-748e-4677-b9ad-5f73500b5738","Type":"ContainerStarted","Data":"b64424961aa042ed96ef1029591396221addf2c8ecda14cfbb8388bbeeeb24ff"} Apr 16 20:11:43.882992 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.882925 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w8lw7" event={"ID":"9c16c53d-2e96-42ef-86b8-344e57a267ee","Type":"ContainerStarted","Data":"320e5e335093d0120dbf1610e41083cbc17b548688b619c3fa7bb75b689e07b9"} Apr 16 20:11:43.899331 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.899291 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-31.ec2.internal" event={"ID":"83cea7cfd9a6bfdb650e2cb8e7b4bc3c","Type":"ContainerStarted","Data":"b5e7ba128615b8bbfdfe4071307c6b5318ac553553b42232deb45d316c2ede63"} Apr 16 20:11:43.907769 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:11:43.906411 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" event={"ID":"2d8fda20-4941-47b7-ab67-e2b86e2a8523","Type":"ContainerStarted","Data":"84d423a58ce212350da89443947a9b5d0b16086fdb30dcc55e798c35a1c438b9"}
Apr 16 20:11:43.909653 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.909589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9qvtx" event={"ID":"99d4b836-b69f-4839-b4e1-937780e53711","Type":"ContainerStarted","Data":"bb3ff6c81afd6f3febd893591ec5e531287dca26b067056ce196425add68f743"}
Apr 16 20:11:43.918004 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.917942 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-31.ec2.internal" podStartSLOduration=1.917927583 podStartE2EDuration="1.917927583s" podCreationTimestamp="2026-04-16 20:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:43.917732388 +0000 UTC m=+3.594000633" watchObservedRunningTime="2026-04-16 20:11:43.917927583 +0000 UTC m=+3.594195817"
Apr 16 20:11:43.918704 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.918678 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jwtk7" event={"ID":"512ea84f-ed8b-4a83-a907-ce38ff494ba1","Type":"ContainerStarted","Data":"bc4e4138199dcfe01c1fd16681c106ea4649519d80e05c1e6887a94fefd710a6"}
Apr 16 20:11:43.927804 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.927756 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" event={"ID":"909cb17e-abc8-410f-adb0-206e00624944","Type":"ContainerStarted","Data":"6be0f7580447b9256b1c6b6ccf3dcf27ac72a5d0b6cbbbf6b033a8739ad408d9"}
Apr 16 20:11:43.932447 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.932391 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jwfkw" event={"ID":"56120f2f-13d6-48da-a3d5-a472edb09dfc","Type":"ContainerStarted","Data":"542f91316ae3efa8e983b6359612b72148245e95c4e88cf1720a4f859dbdee64"}
Apr 16 20:11:43.940934 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.940906 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" event={"ID":"22a56877-dc06-432a-9ca1-06a091e2af67","Type":"ContainerStarted","Data":"b0267cb8ab58e6a3d0a1400025ae37bd4f1a8024330c55ac64d24304ef4e9cec"}
Apr 16 20:11:43.949698 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:43.948485 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8gsfm" event={"ID":"e372e398-739d-495f-a53d-0b5eea55bc76","Type":"ContainerStarted","Data":"9e50d5e1fa72eb039423c146eaa76703ad9675dab24af38812b2f2fdd17f0a34"}
Apr 16 20:11:44.369284 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:44.368195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:11:44.369284 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:44.368344 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:44.369284 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:44.368412 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs podName:c7481f18-a2dc-4df2-8b59-4184bc5c7283 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:46.368391129 +0000 UTC m=+6.044659345 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs") pod "network-metrics-daemon-l9pv6" (UID: "c7481f18-a2dc-4df2-8b59-4184bc5c7283") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:44.572372 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:44.572333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mm6t\" (UniqueName: \"kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t\") pod \"network-check-target-t77hx\" (UID: \"1280bc20-0a50-4b20-8b90-8057a3627648\") " pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:11:44.572571 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:44.572554 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:11:44.572641 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:44.572577 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:11:44.572641 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:44.572590 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2mm6t for pod openshift-network-diagnostics/network-check-target-t77hx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:44.572742 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:44.572653 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t podName:1280bc20-0a50-4b20-8b90-8057a3627648 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:46.572635498 +0000 UTC m=+6.248903712 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2mm6t" (UniqueName: "kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t") pod "network-check-target-t77hx" (UID: "1280bc20-0a50-4b20-8b90-8057a3627648") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:44.871314 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:44.871109 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:11:44.871314 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:44.871109 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:11:44.871314 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:44.871261 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283"
Apr 16 20:11:44.871846 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:44.871323 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648"
Apr 16 20:11:44.976324 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:44.976238 2577 generic.go:358] "Generic (PLEG): container finished" podID="4af69d7c0e5c5b8fbd782e214cad2fa5" containerID="c4d958b6937227fa4e5997b68d2b92da93666c8e85abaada2dcbed400b5d84a3" exitCode=0
Apr 16 20:11:44.976492 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:44.976331 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal" event={"ID":"4af69d7c0e5c5b8fbd782e214cad2fa5","Type":"ContainerDied","Data":"c4d958b6937227fa4e5997b68d2b92da93666c8e85abaada2dcbed400b5d84a3"}
Apr 16 20:11:45.982993 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:45.982876 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal" event={"ID":"4af69d7c0e5c5b8fbd782e214cad2fa5","Type":"ContainerStarted","Data":"ac5a8ee21ad76fe18e2c5547a05b2d54857181c7d52a3ef89066da62c854a1a7"}
Apr 16 20:11:46.386017 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:46.385978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:11:46.386201 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:46.386122 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:46.386201 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:46.386178 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs podName:c7481f18-a2dc-4df2-8b59-4184bc5c7283 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:50.386159886 +0000 UTC m=+10.062428103 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs") pod "network-metrics-daemon-l9pv6" (UID: "c7481f18-a2dc-4df2-8b59-4184bc5c7283") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:46.588254 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:46.587791 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mm6t\" (UniqueName: \"kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t\") pod \"network-check-target-t77hx\" (UID: \"1280bc20-0a50-4b20-8b90-8057a3627648\") " pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:11:46.588254 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:46.588003 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:11:46.588254 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:46.588025 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:11:46.588254 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:46.588038 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2mm6t for pod openshift-network-diagnostics/network-check-target-t77hx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:46.588254 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:46.588100 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t podName:1280bc20-0a50-4b20-8b90-8057a3627648 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:50.58807853 +0000 UTC m=+10.264346743 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2mm6t" (UniqueName: "kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t") pod "network-check-target-t77hx" (UID: "1280bc20-0a50-4b20-8b90-8057a3627648") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:46.871438 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:46.871401 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:11:46.871648 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:46.871537 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648"
Apr 16 20:11:46.872847 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:46.872816 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:11:46.872987 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:46.872968 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283"
Apr 16 20:11:48.871098 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:48.871060 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:11:48.871548 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:48.871104 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:11:48.871548 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:48.871194 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648"
Apr 16 20:11:48.871548 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:48.871342 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283"
Apr 16 20:11:49.334886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:49.334835 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-31.ec2.internal" podStartSLOduration=7.334814617 podStartE2EDuration="7.334814617s" podCreationTimestamp="2026-04-16 20:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:45.997125614 +0000 UTC m=+5.673393850" watchObservedRunningTime="2026-04-16 20:11:49.334814617 +0000 UTC m=+9.011082850"
Apr 16 20:11:49.335480 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:49.335455 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8ng5b"]
Apr 16 20:11:49.338766 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:49.338741 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:49.338850 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:49.338827 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb"
Apr 16 20:11:49.410078 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:49.410042 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-kubelet-config\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:49.410254 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:49.410165 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:49.410254 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:49.410240 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-dbus\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:49.510631 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:49.510589 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:49.510798 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:49.510675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-dbus\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:49.510798 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:49.510716 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-kubelet-config\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:49.510798 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:49.510747 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:49.510977 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:49.510825 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret podName:e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb nodeName:}" failed. No retries permitted until 2026-04-16 20:11:50.010802971 +0000 UTC m=+9.687071187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret") pod "global-pull-secret-syncer-8ng5b" (UID: "e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:49.510977 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:49.510825 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-kubelet-config\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:49.511109 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:49.510989 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-dbus\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:50.015312 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:50.015272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:50.015761 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:50.015397 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:50.015761 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:50.015464 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret podName:e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb nodeName:}" failed. No retries permitted until 2026-04-16 20:11:51.015445644 +0000 UTC m=+10.691713883 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret") pod "global-pull-secret-syncer-8ng5b" (UID: "e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:50.418852 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:50.418819 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:11:50.419065 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:50.418967 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:50.419065 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:50.419038 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs podName:c7481f18-a2dc-4df2-8b59-4184bc5c7283 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:58.419017458 +0000 UTC m=+18.095285684 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs") pod "network-metrics-daemon-l9pv6" (UID: "c7481f18-a2dc-4df2-8b59-4184bc5c7283") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:50.620418 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:50.620378 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mm6t\" (UniqueName: \"kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t\") pod \"network-check-target-t77hx\" (UID: \"1280bc20-0a50-4b20-8b90-8057a3627648\") " pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:11:50.620584 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:50.620560 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:11:50.620677 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:50.620587 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:11:50.620677 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:50.620602 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2mm6t for pod openshift-network-diagnostics/network-check-target-t77hx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:50.620677 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:50.620665 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t podName:1280bc20-0a50-4b20-8b90-8057a3627648 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:58.620645493 +0000 UTC m=+18.296913710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2mm6t" (UniqueName: "kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t") pod "network-check-target-t77hx" (UID: "1280bc20-0a50-4b20-8b90-8057a3627648") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:50.871563 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:50.871527 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:11:50.871734 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:50.871574 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:11:50.871734 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:50.871681 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283"
Apr 16 20:11:50.873074 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:50.873039 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648"
Apr 16 20:11:50.873191 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:50.873109 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:50.873241 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:50.873185 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb"
Apr 16 20:11:51.024437 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:51.024397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:51.024889 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:51.024587 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:51.024889 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:51.024650 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret podName:e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.024631466 +0000 UTC m=+12.700899681 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret") pod "global-pull-secret-syncer-8ng5b" (UID: "e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:52.871279 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:52.871241 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:11:52.871780 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:52.871242 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:11:52.871780 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:52.871377 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283"
Apr 16 20:11:52.871780 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:52.871242 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:52.871780 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:52.871458 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648"
Apr 16 20:11:52.871780 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:52.871537 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb"
Apr 16 20:11:53.040183 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:53.040139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:53.040348 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:53.040284 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:53.040399 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:53.040352 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret podName:e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb nodeName:}" failed. No retries permitted until 2026-04-16 20:11:57.040333589 +0000 UTC m=+16.716601808 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret") pod "global-pull-secret-syncer-8ng5b" (UID: "e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:54.871045 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:54.871007 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:11:54.871525 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:54.871007 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:54.871525 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:54.871140 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648"
Apr 16 20:11:54.871525 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:54.871008 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:11:54.871525 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:54.871244 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb"
Apr 16 20:11:54.871525 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:54.871299 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283"
Apr 16 20:11:56.871463 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:56.871422 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:56.871920 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:56.871503 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:11:56.871920 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:56.871593 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:11:56.871920 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:56.871605 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb"
Apr 16 20:11:56.871920 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:56.871696 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283"
Apr 16 20:11:56.871920 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:56.871775 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648"
Apr 16 20:11:57.069585 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:57.069550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:11:57.069750 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:57.069686 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:57.069795 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:57.069759 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret podName:e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb nodeName:}" failed. No retries permitted until 2026-04-16 20:12:05.069739295 +0000 UTC m=+24.746007518 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret") pod "global-pull-secret-syncer-8ng5b" (UID: "e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:58.482219 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:58.482174 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:11:58.482695 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:58.482342 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:58.482695 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:58.482430 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs podName:c7481f18-a2dc-4df2-8b59-4184bc5c7283 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:14.482408503 +0000 UTC m=+34.158676728 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs") pod "network-metrics-daemon-l9pv6" (UID: "c7481f18-a2dc-4df2-8b59-4184bc5c7283") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:58.684569 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:58.684534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mm6t\" (UniqueName: \"kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t\") pod \"network-check-target-t77hx\" (UID: \"1280bc20-0a50-4b20-8b90-8057a3627648\") " pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:11:58.684750 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:58.684731 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:11:58.684799 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:58.684757 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:11:58.684799 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:58.684772 2577 projected.go:194] Error preparing data for projected volume kube-api-access-2mm6t for pod openshift-network-diagnostics/network-check-target-t77hx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:58.684869 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:58.684834 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t podName:1280bc20-0a50-4b20-8b90-8057a3627648 nodeName:}" failed.
No retries permitted until 2026-04-16 20:12:14.684816375 +0000 UTC m=+34.361084590 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2mm6t" (UniqueName: "kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t") pod "network-check-target-t77hx" (UID: "1280bc20-0a50-4b20-8b90-8057a3627648") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:58.871382 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:58.871347 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:11:58.871382 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:58.871379 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:11:58.871621 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:58.871497 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283" Apr 16 20:11:58.871621 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:11:58.871518 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b" Apr 16 20:11:58.871717 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:58.871624 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648" Apr 16 20:11:58.871774 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:11:58.871718 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb" Apr 16 20:12:00.872609 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:00.872291 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:12:00.872609 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:00.872402 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648" Apr 16 20:12:00.873119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:00.872694 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b" Apr 16 20:12:00.873119 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:00.872792 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb" Apr 16 20:12:00.873119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:00.872835 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:12:00.873119 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:00.872914 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283" Apr 16 20:12:01.010809 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:01.010487 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jwfkw" event={"ID":"56120f2f-13d6-48da-a3d5-a472edb09dfc","Type":"ContainerStarted","Data":"ed1a1085d98c22d0eeb48b0a242d252f1f8f9ca2b5d2a7e814e780068edee215"} Apr 16 20:12:01.013536 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:01.013494 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w8lw7" event={"ID":"9c16c53d-2e96-42ef-86b8-344e57a267ee","Type":"ContainerStarted","Data":"3b629abc27f0b879ebf14b808f5d5e347c01ce98997956c08fe2a0bbb57191bb"} Apr 16 20:12:01.028414 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:01.027697 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jwfkw" podStartSLOduration=12.025595784 podStartE2EDuration="21.027680048s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:43.484741104 +0000 UTC m=+3.161009317" lastFinishedPulling="2026-04-16 20:11:52.486825356 +0000 UTC m=+12.163093581" observedRunningTime="2026-04-16 20:12:01.027038693 +0000 UTC m=+20.703306926" 
watchObservedRunningTime="2026-04-16 20:12:01.027680048 +0000 UTC m=+20.703948282" Apr 16 20:12:01.042458 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:01.042402 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w8lw7" podStartSLOduration=3.092497179 podStartE2EDuration="20.042383276s" podCreationTimestamp="2026-04-16 20:11:41 +0000 UTC" firstStartedPulling="2026-04-16 20:11:43.472746823 +0000 UTC m=+3.149015033" lastFinishedPulling="2026-04-16 20:12:00.42263292 +0000 UTC m=+20.098901130" observedRunningTime="2026-04-16 20:12:01.041919518 +0000 UTC m=+20.718187745" watchObservedRunningTime="2026-04-16 20:12:01.042383276 +0000 UTC m=+20.718651511" Apr 16 20:12:01.940103 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:01.939853 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:12:02.015814 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.015782 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8gsfm" event={"ID":"e372e398-739d-495f-a53d-0b5eea55bc76","Type":"ContainerStarted","Data":"1711110ca436a25e3a87fb4b0c1dd7a46205f4eb2fa0e15e6ac6d620bb31989e"} Apr 16 20:12:02.017157 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.017134 2577 generic.go:358] "Generic (PLEG): container finished" podID="bd1cbed9-748e-4677-b9ad-5f73500b5738" containerID="f5b20722bfcf00d25e3693a4ab88d12c8ab612e7ad7bab4f4ab5e0d4f329e5f1" exitCode=0 Apr 16 20:12:02.017261 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.017200 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gf96" event={"ID":"bd1cbed9-748e-4677-b9ad-5f73500b5738","Type":"ContainerDied","Data":"f5b20722bfcf00d25e3693a4ab88d12c8ab612e7ad7bab4f4ab5e0d4f329e5f1"} Apr 16 20:12:02.018838 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.018811 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" event={"ID":"2d8fda20-4941-47b7-ab67-e2b86e2a8523","Type":"ContainerStarted","Data":"080923af80e8bfd43d5ba48e0e4e0196c7cc1b3e6dd5bbe091ce6dff4d6a8922"} Apr 16 20:12:02.018838 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.018835 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" event={"ID":"2d8fda20-4941-47b7-ab67-e2b86e2a8523","Type":"ContainerStarted","Data":"1f4b4cb610ec8c6395a1277b2a3d29d869dd5977017725259e93d4501d9a47fb"} Apr 16 20:12:02.020191 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.020171 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9qvtx" event={"ID":"99d4b836-b69f-4839-b4e1-937780e53711","Type":"ContainerStarted","Data":"9acac837276e2a322e3ad509a9b9df581443251a642271af0c9f305b085916dd"} Apr 16 20:12:02.022685 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.022665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" event={"ID":"909cb17e-abc8-410f-adb0-206e00624944","Type":"ContainerStarted","Data":"5d434a145a6c9ac3af921b8dcf5bf003a5cdea65e2cfd4807cb7d990d8c47dae"} Apr 16 20:12:02.022771 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.022693 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" event={"ID":"909cb17e-abc8-410f-adb0-206e00624944","Type":"ContainerStarted","Data":"04998885c6c1fc47d800eabc23f897308d08e681bf25b0f10b92779d0bd10184"} Apr 16 20:12:02.022771 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.022705 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" event={"ID":"909cb17e-abc8-410f-adb0-206e00624944","Type":"ContainerStarted","Data":"7aa85268e2e4c95337c8084de14ded28c1726fbdad0a64fe4fa4714253728b75"} Apr 16 20:12:02.022771 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:12:02.022715 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" event={"ID":"909cb17e-abc8-410f-adb0-206e00624944","Type":"ContainerStarted","Data":"b7e6f112d0fe71b06ad85bc3d5ca10dcb6974c4e97bb0435f529e23e6c73ed1c"} Apr 16 20:12:02.022771 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.022723 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" event={"ID":"909cb17e-abc8-410f-adb0-206e00624944","Type":"ContainerStarted","Data":"ee2fbe514b0a53e29abd0766b541429e7e9478e387c821b46252d3212fa87a8e"} Apr 16 20:12:02.022771 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.022735 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" event={"ID":"909cb17e-abc8-410f-adb0-206e00624944","Type":"ContainerStarted","Data":"ed1af19c0e82bdf0736aeb7747ef4427668dd50e71852aeb4fc3ac4029602418"} Apr 16 20:12:02.023836 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.023816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" event={"ID":"22a56877-dc06-432a-9ca1-06a091e2af67","Type":"ContainerStarted","Data":"0f60716cf1246b47c86977ee2a549a60d7a05947c894e873ec563195a2080218"} Apr 16 20:12:02.031304 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.031238 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8gsfm" podStartSLOduration=3.738186179 podStartE2EDuration="21.031225865s" podCreationTimestamp="2026-04-16 20:11:41 +0000 UTC" firstStartedPulling="2026-04-16 20:11:43.475881773 +0000 UTC m=+3.152149982" lastFinishedPulling="2026-04-16 20:12:00.768921453 +0000 UTC m=+20.445189668" observedRunningTime="2026-04-16 20:12:02.031100844 +0000 UTC m=+21.707369075" watchObservedRunningTime="2026-04-16 20:12:02.031225865 +0000 UTC m=+21.707494098" Apr 16 20:12:02.044324 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:12:02.044292 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rcrtv" podStartSLOduration=4.759554069 podStartE2EDuration="22.04428214s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:43.480088081 +0000 UTC m=+3.156356306" lastFinishedPulling="2026-04-16 20:12:00.764816154 +0000 UTC m=+20.441084377" observedRunningTime="2026-04-16 20:12:02.044105024 +0000 UTC m=+21.720373256" watchObservedRunningTime="2026-04-16 20:12:02.04428214 +0000 UTC m=+21.720550372" Apr 16 20:12:02.056903 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.056871 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9qvtx" podStartSLOduration=3.7781295200000002 podStartE2EDuration="21.056862461s" podCreationTimestamp="2026-04-16 20:11:41 +0000 UTC" firstStartedPulling="2026-04-16 20:11:43.484527296 +0000 UTC m=+3.160795514" lastFinishedPulling="2026-04-16 20:12:00.763260224 +0000 UTC m=+20.439528455" observedRunningTime="2026-04-16 20:12:02.056573915 +0000 UTC m=+21.732842147" watchObservedRunningTime="2026-04-16 20:12:02.056862461 +0000 UTC m=+21.733130693" Apr 16 20:12:02.809578 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.809443 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:12:01.940100205Z","UUID":"dd622018-8002-4de0-af40-d784f5654696","Handler":null,"Name":"","Endpoint":""} Apr 16 20:12:02.812393 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.812365 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:12:02.812485 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.812414 2577 csi_plugin.go:119] kubernetes.io/csi: 
Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:12:02.870404 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.870372 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b" Apr 16 20:12:02.870561 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.870444 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:12:02.870561 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:02.870463 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:12:02.870640 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:02.870574 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283" Apr 16 20:12:02.870640 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:02.870612 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648" Apr 16 20:12:02.870746 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:02.870720 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb" Apr 16 20:12:03.027210 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:03.027116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jwtk7" event={"ID":"512ea84f-ed8b-4a83-a907-ce38ff494ba1","Type":"ContainerStarted","Data":"d7485ea7b3604cc13133df931c5f28bcc9a1b1da0052ab6a64419e659ebc23a6"} Apr 16 20:12:03.029170 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:03.029140 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" event={"ID":"2d8fda20-4941-47b7-ab67-e2b86e2a8523","Type":"ContainerStarted","Data":"d6ce8d4e22d74a48b8b44762625f4be1851c137f8708c63ea75276917038d4dd"} Apr 16 20:12:03.105190 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:03.105130 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7tn9t" podStartSLOduration=3.882201718 podStartE2EDuration="23.105110136s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:43.482540126 +0000 UTC m=+3.158808335" lastFinishedPulling="2026-04-16 20:12:02.705448528 +0000 UTC m=+22.381716753" observedRunningTime="2026-04-16 20:12:03.105016204 +0000 UTC m=+22.781284436" watchObservedRunningTime="2026-04-16 20:12:03.105110136 +0000 UTC m=+22.781378404" Apr 16 20:12:03.106041 ip-10-0-131-31 kubenswrapper[2577]: 
I0416 20:12:03.105996 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jwtk7" podStartSLOduration=6.165812807 podStartE2EDuration="23.105986141s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:43.482486062 +0000 UTC m=+3.158754275" lastFinishedPulling="2026-04-16 20:12:00.4226594 +0000 UTC m=+20.098927609" observedRunningTime="2026-04-16 20:12:03.054397109 +0000 UTC m=+22.730665341" watchObservedRunningTime="2026-04-16 20:12:03.105986141 +0000 UTC m=+22.782254373" Apr 16 20:12:04.034331 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:04.034284 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" event={"ID":"909cb17e-abc8-410f-adb0-206e00624944","Type":"ContainerStarted","Data":"e238dc66459f085356cdc1badb43a14a3b7222685a06485b97031e5b491de454"} Apr 16 20:12:04.603831 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:04.603801 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:12:04.871028 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:04.870777 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:12:04.871183 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:04.870777 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b" Apr 16 20:12:04.871183 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:04.871069 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648" Apr 16 20:12:04.871183 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:04.870790 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:12:04.871183 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:04.871165 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb" Apr 16 20:12:04.871395 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:04.871262 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283" Apr 16 20:12:05.133017 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:05.132923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b" Apr 16 20:12:05.133478 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:05.133058 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:05.133478 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:05.133132 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret podName:e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb nodeName:}" failed. No retries permitted until 2026-04-16 20:12:21.133110476 +0000 UTC m=+40.809378692 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret") pod "global-pull-secret-syncer-8ng5b" (UID: "e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:05.900588 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:05.900561 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:12:05.901074 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:05.901055 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:12:06.042005 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:06.041942 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" event={"ID":"909cb17e-abc8-410f-adb0-206e00624944","Type":"ContainerStarted","Data":"dcc236dcbc1e56b47ef8b5120c073a2a88a14264127f31d40a3c89736307f6a8"} Apr 16 20:12:06.042298 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:06.042270 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:12:06.042513 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:06.042490 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:12:06.042593 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:06.042525 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:12:06.043044 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:06.043022 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9qvtx" Apr 16 20:12:06.056671 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:06.056644 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:12:06.056803 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:06.056791 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" Apr 16 20:12:06.068066 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:06.068013 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-985k4" podStartSLOduration=8.363373897 podStartE2EDuration="26.067996708s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:11:43.484222389 +0000 UTC m=+3.160490600" lastFinishedPulling="2026-04-16 20:12:01.188845184 +0000 UTC m=+20.865113411" observedRunningTime="2026-04-16 20:12:06.067554915 +0000 UTC m=+25.743823148" watchObservedRunningTime="2026-04-16 20:12:06.067996708 +0000 UTC m=+25.744264942" Apr 16 20:12:06.870730 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:06.870549 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:12:06.871520 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:06.870612 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b" Apr 16 20:12:06.871520 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:06.870820 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283" Apr 16 20:12:06.871520 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:06.870620 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:12:06.871520 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:06.870864 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb" Apr 16 20:12:06.871520 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:06.870939 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648" Apr 16 20:12:07.044997 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:07.044966 2577 generic.go:358] "Generic (PLEG): container finished" podID="bd1cbed9-748e-4677-b9ad-5f73500b5738" containerID="0e4507e9f3ea52bcb6722df4368bdd6057637674617f2bb498696c56b827c1d8" exitCode=0 Apr 16 20:12:07.045203 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:07.045045 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gf96" event={"ID":"bd1cbed9-748e-4677-b9ad-5f73500b5738","Type":"ContainerDied","Data":"0e4507e9f3ea52bcb6722df4368bdd6057637674617f2bb498696c56b827c1d8"} Apr 16 20:12:07.856218 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:07.856189 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8ng5b"] Apr 16 20:12:07.856391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:07.856300 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b" Apr 16 20:12:07.856465 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:07.856402 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb" Apr 16 20:12:07.859872 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:07.859611 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l9pv6"] Apr 16 20:12:07.859872 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:07.859717 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:12:07.859872 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:07.859827 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283" Apr 16 20:12:07.860364 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:07.860344 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t77hx"] Apr 16 20:12:07.860455 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:07.860421 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:12:07.860514 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:07.860499 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648" Apr 16 20:12:08.048601 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:08.048526 2577 generic.go:358] "Generic (PLEG): container finished" podID="bd1cbed9-748e-4677-b9ad-5f73500b5738" containerID="b101861b2a4536a1b102c544b311ad61ae301252d4450ba54b98bda2f9a62e54" exitCode=0 Apr 16 20:12:08.048996 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:08.048616 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gf96" event={"ID":"bd1cbed9-748e-4677-b9ad-5f73500b5738","Type":"ContainerDied","Data":"b101861b2a4536a1b102c544b311ad61ae301252d4450ba54b98bda2f9a62e54"} Apr 16 20:12:09.051845 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:09.051810 2577 generic.go:358] "Generic (PLEG): container finished" podID="bd1cbed9-748e-4677-b9ad-5f73500b5738" containerID="d57144ba3c10f882f630b9c50c0d1b85a2913546e85649ff5cd3f36b08d6d1d8" exitCode=0 Apr 16 20:12:09.052285 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:09.051856 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gf96" event={"ID":"bd1cbed9-748e-4677-b9ad-5f73500b5738","Type":"ContainerDied","Data":"d57144ba3c10f882f630b9c50c0d1b85a2913546e85649ff5cd3f36b08d6d1d8"} Apr 16 20:12:09.871206 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:09.871177 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b" Apr 16 20:12:09.871331 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:09.871188 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:12:09.871331 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:09.871284 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb" Apr 16 20:12:09.871431 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:09.871186 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:12:09.871431 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:09.871361 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283" Apr 16 20:12:09.871431 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:09.871447 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648" Apr 16 20:12:11.871010 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:11.870780 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b" Apr 16 20:12:11.871474 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:11.870790 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:12:11.871474 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:11.871112 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8ng5b" podUID="e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb" Apr 16 20:12:11.871474 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:11.870928 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:12:11.871474 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:11.871381 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283" Apr 16 20:12:11.871474 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:11.871412 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t77hx" podUID="1280bc20-0a50-4b20-8b90-8057a3627648" Apr 16 20:12:13.625418 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.625339 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-31.ec2.internal" event="NodeReady" Apr 16 20:12:13.625880 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.625490 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 20:12:13.704117 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.704080 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lbx4w"] Apr 16 20:12:13.741923 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.741185 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nv4vl"] Apr 16 20:12:13.741923 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.741395 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:13.744658 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.744633 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gq66p\"" Apr 16 20:12:13.745462 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.745444 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 20:12:13.745535 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.745443 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 20:12:13.766439 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.766411 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lbx4w"] Apr 16 20:12:13.766586 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.766458 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nv4vl"] Apr 16 20:12:13.766586 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.766463 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nv4vl" Apr 16 20:12:13.770092 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.770040 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 20:12:13.770222 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.770168 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 20:12:13.770285 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.770227 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-28d82\"" Apr 16 20:12:13.770696 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.770677 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 20:12:13.870724 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.870690 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:12:13.870724 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.870710 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:12:13.870968 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.870691 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b" Apr 16 20:12:13.876043 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.875725 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8662n\"" Apr 16 20:12:13.876043 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.875766 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fgc2s\"" Apr 16 20:12:13.876043 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.875727 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:12:13.876043 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.875815 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:12:13.876465 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.876444 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:12:13.876550 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.876464 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:12:13.901108 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.901089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:13.901227 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.901132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n76ls\" (UniqueName: 
\"kubernetes.io/projected/9865442e-a577-4b94-be71-e7cbe1fa9199-kube-api-access-n76ls\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl" Apr 16 20:12:13.901227 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.901198 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z996\" (UniqueName: \"kubernetes.io/projected/07dfb99b-8784-4865-98a4-3e29b5215bfd-kube-api-access-4z996\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:13.901341 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.901229 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07dfb99b-8784-4865-98a4-3e29b5215bfd-config-volume\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:13.901341 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.901268 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07dfb99b-8784-4865-98a4-3e29b5215bfd-tmp-dir\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:13.901341 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:13.901294 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl" Apr 16 20:12:14.002466 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.002431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:14.002466 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.002478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n76ls\" (UniqueName: \"kubernetes.io/projected/9865442e-a577-4b94-be71-e7cbe1fa9199-kube-api-access-n76ls\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl" Apr 16 20:12:14.002701 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.002528 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4z996\" (UniqueName: \"kubernetes.io/projected/07dfb99b-8784-4865-98a4-3e29b5215bfd-kube-api-access-4z996\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:14.002701 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.002556 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07dfb99b-8784-4865-98a4-3e29b5215bfd-config-volume\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:14.002701 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:14.002591 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:14.002701 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.002636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07dfb99b-8784-4865-98a4-3e29b5215bfd-tmp-dir\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " 
pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:14.002701 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:14.002666 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls podName:07dfb99b-8784-4865-98a4-3e29b5215bfd nodeName:}" failed. No retries permitted until 2026-04-16 20:12:14.502646109 +0000 UTC m=+34.178914335 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls") pod "dns-default-lbx4w" (UID: "07dfb99b-8784-4865-98a4-3e29b5215bfd") : secret "dns-default-metrics-tls" not found Apr 16 20:12:14.002701 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.002698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl" Apr 16 20:12:14.003001 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:14.002794 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:14.003001 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:14.002832 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert podName:9865442e-a577-4b94-be71-e7cbe1fa9199 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:14.502820871 +0000 UTC m=+34.179089084 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert") pod "ingress-canary-nv4vl" (UID: "9865442e-a577-4b94-be71-e7cbe1fa9199") : secret "canary-serving-cert" not found Apr 16 20:12:14.003203 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.003180 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07dfb99b-8784-4865-98a4-3e29b5215bfd-config-volume\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:14.013428 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.013400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07dfb99b-8784-4865-98a4-3e29b5215bfd-tmp-dir\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:14.015278 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.015256 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z996\" (UniqueName: \"kubernetes.io/projected/07dfb99b-8784-4865-98a4-3e29b5215bfd-kube-api-access-4z996\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:14.015368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.015290 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n76ls\" (UniqueName: \"kubernetes.io/projected/9865442e-a577-4b94-be71-e7cbe1fa9199-kube-api-access-n76ls\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl" Apr 16 20:12:14.507112 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.507063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl" Apr 16 20:12:14.507308 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.507156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:14.507308 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.507199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:12:14.507308 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:14.507256 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:14.507308 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:14.507288 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:12:14.507510 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:14.507319 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert podName:9865442e-a577-4b94-be71-e7cbe1fa9199 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:15.507298756 +0000 UTC m=+35.183566979 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert") pod "ingress-canary-nv4vl" (UID: "9865442e-a577-4b94-be71-e7cbe1fa9199") : secret "canary-serving-cert" not found Apr 16 20:12:14.507510 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:14.507339 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs podName:c7481f18-a2dc-4df2-8b59-4184bc5c7283 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:46.507329147 +0000 UTC m=+66.183597358 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs") pod "network-metrics-daemon-l9pv6" (UID: "c7481f18-a2dc-4df2-8b59-4184bc5c7283") : secret "metrics-daemon-secret" not found Apr 16 20:12:14.507510 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:14.507417 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:14.507510 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:14.507502 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls podName:07dfb99b-8784-4865-98a4-3e29b5215bfd nodeName:}" failed. No retries permitted until 2026-04-16 20:12:15.507481428 +0000 UTC m=+35.183749658 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls") pod "dns-default-lbx4w" (UID: "07dfb99b-8784-4865-98a4-3e29b5215bfd") : secret "dns-default-metrics-tls" not found Apr 16 20:12:14.708463 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.708426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mm6t\" (UniqueName: \"kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t\") pod \"network-check-target-t77hx\" (UID: \"1280bc20-0a50-4b20-8b90-8057a3627648\") " pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:12:14.711651 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.711619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mm6t\" (UniqueName: \"kubernetes.io/projected/1280bc20-0a50-4b20-8b90-8057a3627648-kube-api-access-2mm6t\") pod \"network-check-target-t77hx\" (UID: \"1280bc20-0a50-4b20-8b90-8057a3627648\") " pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:12:14.782026 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.781943 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t77hx" Apr 16 20:12:14.993243 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:14.993076 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t77hx"] Apr 16 20:12:15.002384 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:12:15.002358 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1280bc20_0a50_4b20_8b90_8057a3627648.slice/crio-ea3d84e71c31d7db0bd0e5c373588de9b44ed3d8e7c43d1afc9a21953a7d6782 WatchSource:0}: Error finding container ea3d84e71c31d7db0bd0e5c373588de9b44ed3d8e7c43d1afc9a21953a7d6782: Status 404 returned error can't find the container with id ea3d84e71c31d7db0bd0e5c373588de9b44ed3d8e7c43d1afc9a21953a7d6782 Apr 16 20:12:15.064169 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:15.064093 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t77hx" event={"ID":"1280bc20-0a50-4b20-8b90-8057a3627648","Type":"ContainerStarted","Data":"ea3d84e71c31d7db0bd0e5c373588de9b44ed3d8e7c43d1afc9a21953a7d6782"} Apr 16 20:12:15.515171 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:15.515139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w" Apr 16 20:12:15.515328 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:15.515202 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl" Apr 16 20:12:15.515328 ip-10-0-131-31 
kubenswrapper[2577]: E0416 20:12:15.515295 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:15.515433 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:15.515363 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert podName:9865442e-a577-4b94-be71-e7cbe1fa9199 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:17.515344854 +0000 UTC m=+37.191613073 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert") pod "ingress-canary-nv4vl" (UID: "9865442e-a577-4b94-be71-e7cbe1fa9199") : secret "canary-serving-cert" not found Apr 16 20:12:15.515433 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:15.515301 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:15.515522 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:15.515460 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls podName:07dfb99b-8784-4865-98a4-3e29b5215bfd nodeName:}" failed. No retries permitted until 2026-04-16 20:12:17.515441834 +0000 UTC m=+37.191710055 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls") pod "dns-default-lbx4w" (UID: "07dfb99b-8784-4865-98a4-3e29b5215bfd") : secret "dns-default-metrics-tls" not found Apr 16 20:12:16.068528 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:16.068492 2577 generic.go:358] "Generic (PLEG): container finished" podID="bd1cbed9-748e-4677-b9ad-5f73500b5738" containerID="49f630bac8f8070ff246f3695906a36c165e8dfafebec2b37755ea6d0e441e9a" exitCode=0 Apr 16 20:12:16.069112 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:16.068548 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gf96" event={"ID":"bd1cbed9-748e-4677-b9ad-5f73500b5738","Type":"ContainerDied","Data":"49f630bac8f8070ff246f3695906a36c165e8dfafebec2b37755ea6d0e441e9a"} Apr 16 20:12:17.073528 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:17.073492 2577 generic.go:358] "Generic (PLEG): container finished" podID="bd1cbed9-748e-4677-b9ad-5f73500b5738" containerID="68302f0cb4d29500453a61a2171f89ff4dca4fc15f23140098edc5757a677da5" exitCode=0 Apr 16 20:12:17.074050 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:17.073577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gf96" event={"ID":"bd1cbed9-748e-4677-b9ad-5f73500b5738","Type":"ContainerDied","Data":"68302f0cb4d29500453a61a2171f89ff4dca4fc15f23140098edc5757a677da5"} Apr 16 20:12:17.531387 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:17.531338 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl" Apr 16 20:12:17.531563 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:17.531427 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w"
Apr 16 20:12:17.531563 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:17.531462 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:17.531563 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:17.531545 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert podName:9865442e-a577-4b94-be71-e7cbe1fa9199 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:21.531523949 +0000 UTC m=+41.207792177 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert") pod "ingress-canary-nv4vl" (UID: "9865442e-a577-4b94-be71-e7cbe1fa9199") : secret "canary-serving-cert" not found
Apr 16 20:12:17.531736 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:17.531569 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:17.531736 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:17.531621 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls podName:07dfb99b-8784-4865-98a4-3e29b5215bfd nodeName:}" failed. No retries permitted until 2026-04-16 20:12:21.531608464 +0000 UTC m=+41.207876673 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls") pod "dns-default-lbx4w" (UID: "07dfb99b-8784-4865-98a4-3e29b5215bfd") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:18.078545 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:18.078516 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gf96" event={"ID":"bd1cbed9-748e-4677-b9ad-5f73500b5738","Type":"ContainerStarted","Data":"a69947441f657b49543f30047c5a0ef48ab6e301525066cd9045fedbef9af879"}
Apr 16 20:12:19.082064 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:19.081867 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t77hx" event={"ID":"1280bc20-0a50-4b20-8b90-8057a3627648","Type":"ContainerStarted","Data":"ff214eabf11a55593cb473beb23dcedfa5b2879a4cc2eadd3926619321ce9c96"}
Apr 16 20:12:19.082447 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:19.082077 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:12:19.099698 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:19.099657 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-t77hx" podStartSLOduration=35.028389271 podStartE2EDuration="38.099646432s" podCreationTimestamp="2026-04-16 20:11:41 +0000 UTC" firstStartedPulling="2026-04-16 20:12:15.105458133 +0000 UTC m=+34.781726344" lastFinishedPulling="2026-04-16 20:12:18.176715291 +0000 UTC m=+37.852983505" observedRunningTime="2026-04-16 20:12:19.098692744 +0000 UTC m=+38.774960976" watchObservedRunningTime="2026-04-16 20:12:19.099646432 +0000 UTC m=+38.775914658"
Apr 16 20:12:19.126754 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:19.126720 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6gf96" podStartSLOduration=6.473060059 podStartE2EDuration="38.126709493s" podCreationTimestamp="2026-04-16 20:11:41 +0000 UTC" firstStartedPulling="2026-04-16 20:11:43.474107787 +0000 UTC m=+3.150376011" lastFinishedPulling="2026-04-16 20:12:15.127757232 +0000 UTC m=+34.804025445" observedRunningTime="2026-04-16 20:12:19.122859303 +0000 UTC m=+38.799127534" watchObservedRunningTime="2026-04-16 20:12:19.126709493 +0000 UTC m=+38.802977724"
Apr 16 20:12:21.156698 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:21.156661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:12:21.160571 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:21.160549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb-original-pull-secret\") pod \"global-pull-secret-syncer-8ng5b\" (UID: \"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb\") " pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:12:21.388580 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:21.388546 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8ng5b"
Apr 16 20:12:21.497455 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:21.497424 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8ng5b"]
Apr 16 20:12:21.511284 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:12:21.511256 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode23b04fb_ea11_4ca2_9b81_1b9b8594f1eb.slice/crio-fae144a7d0473c23bd0a10bcea0671c515f0d5e451bcd8a04788b6ba18547b27 WatchSource:0}: Error finding container fae144a7d0473c23bd0a10bcea0671c515f0d5e451bcd8a04788b6ba18547b27: Status 404 returned error can't find the container with id fae144a7d0473c23bd0a10bcea0671c515f0d5e451bcd8a04788b6ba18547b27
Apr 16 20:12:21.559507 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:21.559476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w"
Apr 16 20:12:21.559627 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:21.559537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl"
Apr 16 20:12:21.559627 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:21.559625 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:21.559733 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:21.559650 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:21.559733 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:21.559668 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert podName:9865442e-a577-4b94-be71-e7cbe1fa9199 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:29.559655419 +0000 UTC m=+49.235923629 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert") pod "ingress-canary-nv4vl" (UID: "9865442e-a577-4b94-be71-e7cbe1fa9199") : secret "canary-serving-cert" not found
Apr 16 20:12:21.559733 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:21.559713 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls podName:07dfb99b-8784-4865-98a4-3e29b5215bfd nodeName:}" failed. No retries permitted until 2026-04-16 20:12:29.559694923 +0000 UTC m=+49.235963139 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls") pod "dns-default-lbx4w" (UID: "07dfb99b-8784-4865-98a4-3e29b5215bfd") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:22.087856 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:22.087820 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8ng5b" event={"ID":"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb","Type":"ContainerStarted","Data":"fae144a7d0473c23bd0a10bcea0671c515f0d5e451bcd8a04788b6ba18547b27"}
Apr 16 20:12:26.097086 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:26.097048 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8ng5b" event={"ID":"e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb","Type":"ContainerStarted","Data":"64a82069e7343adbfac6c59a86f872ca9937fbf00d8bda53ef16f576a9d7c0ba"}
Apr 16 20:12:26.111243 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:26.111204 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8ng5b" podStartSLOduration=33.045070736 podStartE2EDuration="37.111192994s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:12:21.513339107 +0000 UTC m=+41.189607321" lastFinishedPulling="2026-04-16 20:12:25.579461356 +0000 UTC m=+45.255729579" observedRunningTime="2026-04-16 20:12:26.110697326 +0000 UTC m=+45.786965559" watchObservedRunningTime="2026-04-16 20:12:26.111192994 +0000 UTC m=+45.787461225"
Apr 16 20:12:29.612819 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:29.612777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w"
Apr 16 20:12:29.613282 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:29.612861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl"
Apr 16 20:12:29.613282 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:29.612934 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:29.613282 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:29.613018 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert podName:9865442e-a577-4b94-be71-e7cbe1fa9199 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:45.613004206 +0000 UTC m=+65.289272415 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert") pod "ingress-canary-nv4vl" (UID: "9865442e-a577-4b94-be71-e7cbe1fa9199") : secret "canary-serving-cert" not found
Apr 16 20:12:29.613282 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:29.612934 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:29.613282 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:29.613092 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls podName:07dfb99b-8784-4865-98a4-3e29b5215bfd nodeName:}" failed. No retries permitted until 2026-04-16 20:12:45.613079366 +0000 UTC m=+65.289347579 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls") pod "dns-default-lbx4w" (UID: "07dfb99b-8784-4865-98a4-3e29b5215bfd") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:38.063744 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:38.063714 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-985k4"
Apr 16 20:12:45.622598 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:45.622558 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w"
Apr 16 20:12:45.623092 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:45.622619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl"
Apr 16 20:12:45.623092 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:45.622697 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:45.623092 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:45.622703 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:45.623092 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:45.622750 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert podName:9865442e-a577-4b94-be71-e7cbe1fa9199 nodeName:}" failed. No retries permitted until 2026-04-16 20:13:17.6227377 +0000 UTC m=+97.299005910 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert") pod "ingress-canary-nv4vl" (UID: "9865442e-a577-4b94-be71-e7cbe1fa9199") : secret "canary-serving-cert" not found
Apr 16 20:12:45.623092 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:45.622763 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls podName:07dfb99b-8784-4865-98a4-3e29b5215bfd nodeName:}" failed. No retries permitted until 2026-04-16 20:13:17.622756389 +0000 UTC m=+97.299024598 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls") pod "dns-default-lbx4w" (UID: "07dfb99b-8784-4865-98a4-3e29b5215bfd") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:46.528687 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:46.528638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:12:46.528858 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:46.528771 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:12:46.528858 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:12:46.528856 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs podName:c7481f18-a2dc-4df2-8b59-4184bc5c7283 nodeName:}" failed. No retries permitted until 2026-04-16 20:13:50.528841338 +0000 UTC m=+130.205109548 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs") pod "network-metrics-daemon-l9pv6" (UID: "c7481f18-a2dc-4df2-8b59-4184bc5c7283") : secret "metrics-daemon-secret" not found
Apr 16 20:12:50.086818 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:12:50.086785 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-t77hx"
Apr 16 20:13:17.639217 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:17.639168 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w"
Apr 16 20:13:17.639614 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:17.639241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl"
Apr 16 20:13:17.639614 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:13:17.639317 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:13:17.639614 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:13:17.639323 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:13:17.639614 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:13:17.639368 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert podName:9865442e-a577-4b94-be71-e7cbe1fa9199 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:21.639355451 +0000 UTC m=+161.315623661 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert") pod "ingress-canary-nv4vl" (UID: "9865442e-a577-4b94-be71-e7cbe1fa9199") : secret "canary-serving-cert" not found
Apr 16 20:13:17.639614 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:13:17.639381 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls podName:07dfb99b-8784-4865-98a4-3e29b5215bfd nodeName:}" failed. No retries permitted until 2026-04-16 20:14:21.639375253 +0000 UTC m=+161.315643462 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls") pod "dns-default-lbx4w" (UID: "07dfb99b-8784-4865-98a4-3e29b5215bfd") : secret "dns-default-metrics-tls" not found
Apr 16 20:13:50.559991 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:50.559934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6"
Apr 16 20:13:50.560448 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:13:50.560081 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:13:50.560448 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:13:50.560155 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs podName:c7481f18-a2dc-4df2-8b59-4184bc5c7283 nodeName:}" failed. No retries permitted until 2026-04-16 20:15:52.560139553 +0000 UTC m=+252.236407762 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs") pod "network-metrics-daemon-l9pv6" (UID: "c7481f18-a2dc-4df2-8b59-4184bc5c7283") : secret "metrics-daemon-secret" not found
Apr 16 20:13:59.879555 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.879520 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-767l6"]
Apr 16 20:13:59.881209 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.881191 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-767l6"
Apr 16 20:13:59.883941 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.883919 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 20:13:59.884189 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.884173 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:13:59.884748 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.884732 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mknlj\""
Apr 16 20:13:59.887340 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.887315 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-78d5944d84-snw2w"]
Apr 16 20:13:59.889261 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.889243 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-z9g4t"]
Apr 16 20:13:59.889419 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.889401 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:13:59.890772 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.890756 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:13:59.891628 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.891611 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 20:13:59.892887 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.892870 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 20:13:59.892995 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.892964 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 20:13:59.893061 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.893027 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 20:13:59.893223 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.893210 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-cxprg\""
Apr 16 20:13:59.893760 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.893738 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 20:13:59.893913 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.893889 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 20:13:59.894041 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.894024 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 20:13:59.894317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.894300 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-5wbbz\""
Apr 16 20:13:59.894408 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.894350 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 20:13:59.894472 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.894452 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 20:13:59.894531 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.894516 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 20:13:59.899694 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.899676 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 20:13:59.900897 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.900877 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-767l6"]
Apr 16 20:13:59.915676 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.915656 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-z9g4t"]
Apr 16 20:13:59.922555 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.922531 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-78d5944d84-snw2w"]
Apr 16 20:13:59.992624 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.991275 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gbsjj"]
Apr 16 20:13:59.994309 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.994284 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj"
Apr 16 20:13:59.994690 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.994514 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7"]
Apr 16 20:13:59.996290 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.996273 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7"
Apr 16 20:13:59.997341 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.997319 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:13:59.997439 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.997372 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 20:13:59.997689 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.997675 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-977v4\""
Apr 16 20:13:59.997901 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.997886 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 20:13:59.997943 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.997886 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 20:13:59.998314 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.998287 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:13:59.998314 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.998299 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 20:13:59.998571 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.998555 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-zn979\""
Apr 16 20:13:59.998919 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:13:59.998901 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 20:14:00.004516 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.004495 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 20:14:00.008455 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.008428 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gbsjj"]
Apr 16 20:14:00.009742 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.009724 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7"]
Apr 16 20:14:00.019147 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f6c48392-b5b1-4068-b4a8-8931576660ed-snapshots\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:14:00.019258 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4525\" (UniqueName: \"kubernetes.io/projected/f6c48392-b5b1-4068-b4a8-8931576660ed-kube-api-access-g4525\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:14:00.019258 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019178 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxsf9\" (UniqueName: \"kubernetes.io/projected/cb38b8a2-445d-466a-84c3-eac1bca4f425-kube-api-access-fxsf9\") pod \"volume-data-source-validator-7c6cbb6c87-767l6\" (UID: \"cb38b8a2-445d-466a-84c3-eac1bca4f425\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-767l6"
Apr 16 20:14:00.019258 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019211 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c48392-b5b1-4068-b4a8-8931576660ed-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:14:00.019258 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6c48392-b5b1-4068-b4a8-8931576660ed-serving-cert\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:14:00.019408 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019274 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-stats-auth\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:00.019408 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019309 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c48392-b5b1-4068-b4a8-8931576660ed-service-ca-bundle\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:14:00.019408 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019328 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:00.019408 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019342 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx275\" (UniqueName: \"kubernetes.io/projected/09c2bd17-9feb-4960-ba71-c41b908e03e4-kube-api-access-wx275\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:00.019408 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019363 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-default-certificate\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:00.019408 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019387 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:00.019597 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.019418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6c48392-b5b1-4068-b4a8-8931576660ed-tmp\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:14:00.120659 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.120624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f6c48392-b5b1-4068-b4a8-8931576660ed-snapshots\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:14:00.120659 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.120659 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4525\" (UniqueName: \"kubernetes.io/projected/f6c48392-b5b1-4068-b4a8-8931576660ed-kube-api-access-g4525\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:14:00.120909 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.120678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxsf9\" (UniqueName: \"kubernetes.io/projected/cb38b8a2-445d-466a-84c3-eac1bca4f425-kube-api-access-fxsf9\") pod \"volume-data-source-validator-7c6cbb6c87-767l6\" (UID: \"cb38b8a2-445d-466a-84c3-eac1bca4f425\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-767l6"
Apr 16 20:14:00.120909 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.120700 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/066e6004-882a-4b2c-a626-ecac83f8b502-trusted-ca\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj"
Apr 16 20:14:00.120909 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.120719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxtwq\" (UniqueName: \"kubernetes.io/projected/40311283-c4f7-4c58-93d7-21162ac74b07-kube-api-access-wxtwq\") pod \"cluster-samples-operator-6dc5bdb6b4-drdc7\" (UID: \"40311283-c4f7-4c58-93d7-21162ac74b07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7"
Apr 16 20:14:00.120909 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.120787 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c48392-b5b1-4068-b4a8-8931576660ed-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:14:00.120909 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.120829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066e6004-882a-4b2c-a626-ecac83f8b502-serving-cert\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj"
Apr 16 20:14:00.120909 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.120880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6c48392-b5b1-4068-b4a8-8931576660ed-serving-cert\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:14:00.120909 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.120908 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066e6004-882a-4b2c-a626-ecac83f8b502-config\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj"
Apr 16 20:14:00.121288 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.120941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-stats-auth\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:00.121288 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.121025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c48392-b5b1-4068-b4a8-8931576660ed-service-ca-bundle\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t"
Apr 16 20:14:00.121288 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.121058 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:00.121288 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.121088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wx275\" (UniqueName: \"kubernetes.io/projected/09c2bd17-9feb-4960-ba71-c41b908e03e4-kube-api-access-wx275\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:00.121288 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.121117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-drdc7\" (UID: \"40311283-c4f7-4c58-93d7-21162ac74b07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7"
Apr 16 20:14:00.121288 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.121152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-default-certificate\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:00.121288 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.121178 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:00.121288 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:00.121201 2577 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle podName:09c2bd17-9feb-4960-ba71-c41b908e03e4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:00.621179233 +0000 UTC m=+140.297447443 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle") pod "router-default-78d5944d84-snw2w" (UID: "09c2bd17-9feb-4960-ba71-c41b908e03e4") : configmap references non-existent config key: service-ca.crt Apr 16 20:14:00.121288 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:00.121246 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:14:00.121288 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:00.121296 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs podName:09c2bd17-9feb-4960-ba71-c41b908e03e4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:00.621279892 +0000 UTC m=+140.297548119 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs") pod "router-default-78d5944d84-snw2w" (UID: "09c2bd17-9feb-4960-ba71-c41b908e03e4") : secret "router-metrics-certs-default" not found Apr 16 20:14:00.121766 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.121245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6c48392-b5b1-4068-b4a8-8931576660ed-tmp\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t" Apr 16 20:14:00.121766 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.121333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7lmm\" (UniqueName: \"kubernetes.io/projected/066e6004-882a-4b2c-a626-ecac83f8b502-kube-api-access-q7lmm\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:00.121766 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.121389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f6c48392-b5b1-4068-b4a8-8931576660ed-snapshots\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t" Apr 16 20:14:00.121766 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.121494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6c48392-b5b1-4068-b4a8-8931576660ed-tmp\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t" Apr 16 20:14:00.121766 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:14:00.121685 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c48392-b5b1-4068-b4a8-8931576660ed-service-ca-bundle\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t" Apr 16 20:14:00.122097 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.121791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6c48392-b5b1-4068-b4a8-8931576660ed-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t" Apr 16 20:14:00.123245 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.123225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6c48392-b5b1-4068-b4a8-8931576660ed-serving-cert\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t" Apr 16 20:14:00.123599 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.123583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-default-certificate\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w" Apr 16 20:14:00.123672 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.123652 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-stats-auth\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " 
pod="openshift-ingress/router-default-78d5944d84-snw2w" Apr 16 20:14:00.132125 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.132065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4525\" (UniqueName: \"kubernetes.io/projected/f6c48392-b5b1-4068-b4a8-8931576660ed-kube-api-access-g4525\") pod \"insights-operator-585dfdc468-z9g4t\" (UID: \"f6c48392-b5b1-4068-b4a8-8931576660ed\") " pod="openshift-insights/insights-operator-585dfdc468-z9g4t" Apr 16 20:14:00.132285 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.132265 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxsf9\" (UniqueName: \"kubernetes.io/projected/cb38b8a2-445d-466a-84c3-eac1bca4f425-kube-api-access-fxsf9\") pod \"volume-data-source-validator-7c6cbb6c87-767l6\" (UID: \"cb38b8a2-445d-466a-84c3-eac1bca4f425\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-767l6" Apr 16 20:14:00.132324 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.132292 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx275\" (UniqueName: \"kubernetes.io/projected/09c2bd17-9feb-4960-ba71-c41b908e03e4-kube-api-access-wx275\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w" Apr 16 20:14:00.190590 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.190551 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-767l6" Apr 16 20:14:00.205471 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.205449 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-z9g4t" Apr 16 20:14:00.222786 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.222577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/066e6004-882a-4b2c-a626-ecac83f8b502-trusted-ca\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:00.222786 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.222622 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxtwq\" (UniqueName: \"kubernetes.io/projected/40311283-c4f7-4c58-93d7-21162ac74b07-kube-api-access-wxtwq\") pod \"cluster-samples-operator-6dc5bdb6b4-drdc7\" (UID: \"40311283-c4f7-4c58-93d7-21162ac74b07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7" Apr 16 20:14:00.222786 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.222659 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066e6004-882a-4b2c-a626-ecac83f8b502-serving-cert\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:00.222786 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.222708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066e6004-882a-4b2c-a626-ecac83f8b502-config\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:00.222786 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.222752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-drdc7\" (UID: \"40311283-c4f7-4c58-93d7-21162ac74b07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7" Apr 16 20:14:00.223118 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.222795 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7lmm\" (UniqueName: \"kubernetes.io/projected/066e6004-882a-4b2c-a626-ecac83f8b502-kube-api-access-q7lmm\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:00.223118 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:00.222929 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:14:00.223118 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:00.223036 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls podName:40311283-c4f7-4c58-93d7-21162ac74b07 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:00.723014391 +0000 UTC m=+140.399282617 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-drdc7" (UID: "40311283-c4f7-4c58-93d7-21162ac74b07") : secret "samples-operator-tls" not found Apr 16 20:14:00.223676 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.223654 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/066e6004-882a-4b2c-a626-ecac83f8b502-trusted-ca\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:00.223781 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.223758 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066e6004-882a-4b2c-a626-ecac83f8b502-config\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:00.226070 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.226024 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066e6004-882a-4b2c-a626-ecac83f8b502-serving-cert\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:00.237621 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.237572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxtwq\" (UniqueName: \"kubernetes.io/projected/40311283-c4f7-4c58-93d7-21162ac74b07-kube-api-access-wxtwq\") pod \"cluster-samples-operator-6dc5bdb6b4-drdc7\" (UID: \"40311283-c4f7-4c58-93d7-21162ac74b07\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7" Apr 16 20:14:00.238647 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.238619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7lmm\" (UniqueName: \"kubernetes.io/projected/066e6004-882a-4b2c-a626-ecac83f8b502-kube-api-access-q7lmm\") pod \"console-operator-9d4b6777b-gbsjj\" (UID: \"066e6004-882a-4b2c-a626-ecac83f8b502\") " pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:00.305270 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.305241 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:00.332279 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.332252 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-767l6"] Apr 16 20:14:00.334833 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:00.334793 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb38b8a2_445d_466a_84c3_eac1bca4f425.slice/crio-3f8f7dfe796381f5254d57601b8f9a76972c1b41a1cfca2154f91bd6b9e43a7b WatchSource:0}: Error finding container 3f8f7dfe796381f5254d57601b8f9a76972c1b41a1cfca2154f91bd6b9e43a7b: Status 404 returned error can't find the container with id 3f8f7dfe796381f5254d57601b8f9a76972c1b41a1cfca2154f91bd6b9e43a7b Apr 16 20:14:00.339037 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.338994 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-z9g4t"] Apr 16 20:14:00.342599 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:00.342563 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c48392_b5b1_4068_b4a8_8931576660ed.slice/crio-0840d71375816939cb7b36c4b630de0fe361eed9b35081c050781a0b39dd451b WatchSource:0}: Error finding container 0840d71375816939cb7b36c4b630de0fe361eed9b35081c050781a0b39dd451b: Status 404 returned error can't find the container with id 0840d71375816939cb7b36c4b630de0fe361eed9b35081c050781a0b39dd451b Apr 16 20:14:00.425421 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.425344 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gbsjj"] Apr 16 20:14:00.428241 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:00.428215 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod066e6004_882a_4b2c_a626_ecac83f8b502.slice/crio-4cbea06c5d98c1ff01a28b5884e4a70fe3bee41598041a978d90ac95e81600bb WatchSource:0}: Error finding container 4cbea06c5d98c1ff01a28b5884e4a70fe3bee41598041a978d90ac95e81600bb: Status 404 returned error can't find the container with id 4cbea06c5d98c1ff01a28b5884e4a70fe3bee41598041a978d90ac95e81600bb Apr 16 20:14:00.626721 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.626682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w" Apr 16 20:14:00.626721 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.626727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " 
pod="openshift-ingress/router-default-78d5944d84-snw2w" Apr 16 20:14:00.626910 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:00.626820 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:14:00.626910 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:00.626853 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle podName:09c2bd17-9feb-4960-ba71-c41b908e03e4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:01.62683454 +0000 UTC m=+141.303102750 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle") pod "router-default-78d5944d84-snw2w" (UID: "09c2bd17-9feb-4960-ba71-c41b908e03e4") : configmap references non-existent config key: service-ca.crt Apr 16 20:14:00.626910 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:00.626875 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs podName:09c2bd17-9feb-4960-ba71-c41b908e03e4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:01.626869005 +0000 UTC m=+141.303137215 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs") pod "router-default-78d5944d84-snw2w" (UID: "09c2bd17-9feb-4960-ba71-c41b908e03e4") : secret "router-metrics-certs-default" not found Apr 16 20:14:00.728125 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:00.728046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-drdc7\" (UID: \"40311283-c4f7-4c58-93d7-21162ac74b07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7" Apr 16 20:14:00.728264 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:00.728216 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:14:00.728316 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:00.728294 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls podName:40311283-c4f7-4c58-93d7-21162ac74b07 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:01.728273248 +0000 UTC m=+141.404541464 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-drdc7" (UID: "40311283-c4f7-4c58-93d7-21162ac74b07") : secret "samples-operator-tls" not found Apr 16 20:14:01.274036 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:01.273972 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-z9g4t" event={"ID":"f6c48392-b5b1-4068-b4a8-8931576660ed","Type":"ContainerStarted","Data":"0840d71375816939cb7b36c4b630de0fe361eed9b35081c050781a0b39dd451b"} Apr 16 20:14:01.275318 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:01.275260 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" event={"ID":"066e6004-882a-4b2c-a626-ecac83f8b502","Type":"ContainerStarted","Data":"4cbea06c5d98c1ff01a28b5884e4a70fe3bee41598041a978d90ac95e81600bb"} Apr 16 20:14:01.276605 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:01.276566 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-767l6" event={"ID":"cb38b8a2-445d-466a-84c3-eac1bca4f425","Type":"ContainerStarted","Data":"3f8f7dfe796381f5254d57601b8f9a76972c1b41a1cfca2154f91bd6b9e43a7b"} Apr 16 20:14:01.637672 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:01.637635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w" Apr 16 20:14:01.637823 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:01.637694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w" Apr 16 20:14:01.637865 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:01.637825 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle podName:09c2bd17-9feb-4960-ba71-c41b908e03e4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:03.637808137 +0000 UTC m=+143.314076360 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle") pod "router-default-78d5944d84-snw2w" (UID: "09c2bd17-9feb-4960-ba71-c41b908e03e4") : configmap references non-existent config key: service-ca.crt Apr 16 20:14:01.637938 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:01.637881 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:14:01.638011 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:01.637967 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs podName:09c2bd17-9feb-4960-ba71-c41b908e03e4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:03.637934182 +0000 UTC m=+143.314202392 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs") pod "router-default-78d5944d84-snw2w" (UID: "09c2bd17-9feb-4960-ba71-c41b908e03e4") : secret "router-metrics-certs-default" not found Apr 16 20:14:01.738384 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:01.738347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-drdc7\" (UID: \"40311283-c4f7-4c58-93d7-21162ac74b07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7" Apr 16 20:14:01.738546 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:01.738524 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:14:01.738610 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:01.738600 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls podName:40311283-c4f7-4c58-93d7-21162ac74b07 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:03.738580853 +0000 UTC m=+143.414849065 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-drdc7" (UID: "40311283-c4f7-4c58-93d7-21162ac74b07") : secret "samples-operator-tls" not found Apr 16 20:14:02.279896 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:02.279859 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-767l6" event={"ID":"cb38b8a2-445d-466a-84c3-eac1bca4f425","Type":"ContainerStarted","Data":"5c0be163e66cfb7ed9aaa2ea1ca865b00385d2f8d894b5596cfbe62f0edc1c50"} Apr 16 20:14:02.299867 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:02.299812 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-767l6" podStartSLOduration=1.9776808510000001 podStartE2EDuration="3.299795293s" podCreationTimestamp="2026-04-16 20:13:59 +0000 UTC" firstStartedPulling="2026-04-16 20:14:00.337062924 +0000 UTC m=+140.013331137" lastFinishedPulling="2026-04-16 20:14:01.659177355 +0000 UTC m=+141.335445579" observedRunningTime="2026-04-16 20:14:02.299080814 +0000 UTC m=+141.975349047" watchObservedRunningTime="2026-04-16 20:14:02.299795293 +0000 UTC m=+141.976063525" Apr 16 20:14:03.283316 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:03.283284 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/0.log" Apr 16 20:14:03.283733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:03.283327 2577 generic.go:358] "Generic (PLEG): container finished" podID="066e6004-882a-4b2c-a626-ecac83f8b502" containerID="0d46bd01645659df3aa65066ce4715e5ce3549be5b02a7548de7d489a6d7ac8b" exitCode=255 Apr 16 20:14:03.283733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:03.283420 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" event={"ID":"066e6004-882a-4b2c-a626-ecac83f8b502","Type":"ContainerDied","Data":"0d46bd01645659df3aa65066ce4715e5ce3549be5b02a7548de7d489a6d7ac8b"}
Apr 16 20:14:03.283733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:03.283647 2577 scope.go:117] "RemoveContainer" containerID="0d46bd01645659df3aa65066ce4715e5ce3549be5b02a7548de7d489a6d7ac8b"
Apr 16 20:14:03.284820 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:03.284796 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-z9g4t" event={"ID":"f6c48392-b5b1-4068-b4a8-8931576660ed","Type":"ContainerStarted","Data":"008ef17e3e664d0ad59432817fafe41029ddac70d6ca26ce6b12195aba40554c"}
Apr 16 20:14:03.319359 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:03.319311 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-z9g4t" podStartSLOduration=1.953808242 podStartE2EDuration="4.319294245s" podCreationTimestamp="2026-04-16 20:13:59 +0000 UTC" firstStartedPulling="2026-04-16 20:14:00.34466696 +0000 UTC m=+140.020935185" lastFinishedPulling="2026-04-16 20:14:02.710152978 +0000 UTC m=+142.386421188" observedRunningTime="2026-04-16 20:14:03.318716704 +0000 UTC m=+142.994984935" watchObservedRunningTime="2026-04-16 20:14:03.319294245 +0000 UTC m=+142.995562479"
Apr 16 20:14:03.654776 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:03.654740 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:03.654977 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:03.654799 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:03.654977 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:03.654930 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle podName:09c2bd17-9feb-4960-ba71-c41b908e03e4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:07.654908871 +0000 UTC m=+147.331177097 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle") pod "router-default-78d5944d84-snw2w" (UID: "09c2bd17-9feb-4960-ba71-c41b908e03e4") : configmap references non-existent config key: service-ca.crt
Apr 16 20:14:03.654977 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:03.654934 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 20:14:03.655115 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:03.655006 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs podName:09c2bd17-9feb-4960-ba71-c41b908e03e4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:07.654995228 +0000 UTC m=+147.331263438 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs") pod "router-default-78d5944d84-snw2w" (UID: "09c2bd17-9feb-4960-ba71-c41b908e03e4") : secret "router-metrics-certs-default" not found
Apr 16 20:14:03.755706 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:03.755658 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-drdc7\" (UID: \"40311283-c4f7-4c58-93d7-21162ac74b07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7"
Apr 16 20:14:03.755879 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:03.755834 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 20:14:03.755998 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:03.755921 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls podName:40311283-c4f7-4c58-93d7-21162ac74b07 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:07.755900741 +0000 UTC m=+147.432168968 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-drdc7" (UID: "40311283-c4f7-4c58-93d7-21162ac74b07") : secret "samples-operator-tls" not found
Apr 16 20:14:04.076360 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.076266 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4"]
Apr 16 20:14:04.078378 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.078356 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4"
Apr 16 20:14:04.082538 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.082518 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 20:14:04.082654 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.082603 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 20:14:04.084792 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.084513 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-2jzgl\""
Apr 16 20:14:04.090775 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.090751 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4"]
Apr 16 20:14:04.158743 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.158706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fhkw\" (UniqueName: \"kubernetes.io/projected/2979838d-4d67-4cb3-9880-9d8d330a873f-kube-api-access-7fhkw\") pod \"migrator-74bb7799d9-t6pq4\" (UID: \"2979838d-4d67-4cb3-9880-9d8d330a873f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4"
Apr 16 20:14:04.259656 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.259598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fhkw\" (UniqueName: \"kubernetes.io/projected/2979838d-4d67-4cb3-9880-9d8d330a873f-kube-api-access-7fhkw\") pod \"migrator-74bb7799d9-t6pq4\" (UID: \"2979838d-4d67-4cb3-9880-9d8d330a873f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4"
Apr 16 20:14:04.272497 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.272466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fhkw\" (UniqueName: \"kubernetes.io/projected/2979838d-4d67-4cb3-9880-9d8d330a873f-kube-api-access-7fhkw\") pod \"migrator-74bb7799d9-t6pq4\" (UID: \"2979838d-4d67-4cb3-9880-9d8d330a873f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4"
Apr 16 20:14:04.289118 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.289089 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/1.log"
Apr 16 20:14:04.289557 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.289539 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/0.log"
Apr 16 20:14:04.289608 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.289578 2577 generic.go:358] "Generic (PLEG): container finished" podID="066e6004-882a-4b2c-a626-ecac83f8b502" containerID="58170be2885311314e34046f5b1447725c0ca94cb551a55de747bdef94e0de33" exitCode=255
Apr 16 20:14:04.289699 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.289676 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" event={"ID":"066e6004-882a-4b2c-a626-ecac83f8b502","Type":"ContainerDied","Data":"58170be2885311314e34046f5b1447725c0ca94cb551a55de747bdef94e0de33"}
Apr 16 20:14:04.289759 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.289730 2577 scope.go:117] "RemoveContainer" containerID="0d46bd01645659df3aa65066ce4715e5ce3549be5b02a7548de7d489a6d7ac8b"
Apr 16 20:14:04.289938 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.289919 2577 scope.go:117] "RemoveContainer" containerID="58170be2885311314e34046f5b1447725c0ca94cb551a55de747bdef94e0de33"
Apr 16 20:14:04.290190 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:04.290169 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gbsjj_openshift-console-operator(066e6004-882a-4b2c-a626-ecac83f8b502)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" podUID="066e6004-882a-4b2c-a626-ecac83f8b502"
Apr 16 20:14:04.387437 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.387407 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4"
Apr 16 20:14:04.499380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:04.499346 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4"]
Apr 16 20:14:04.502863 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:04.502828 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2979838d_4d67_4cb3_9880_9d8d330a873f.slice/crio-2b7ac184e3d6c87dc192c8193b03e182a9ce56bc09402dfc49c916337943016c WatchSource:0}: Error finding container 2b7ac184e3d6c87dc192c8193b03e182a9ce56bc09402dfc49c916337943016c: Status 404 returned error can't find the container with id 2b7ac184e3d6c87dc192c8193b03e182a9ce56bc09402dfc49c916337943016c
Apr 16 20:14:05.292595 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:05.292559 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4" event={"ID":"2979838d-4d67-4cb3-9880-9d8d330a873f","Type":"ContainerStarted","Data":"2b7ac184e3d6c87dc192c8193b03e182a9ce56bc09402dfc49c916337943016c"}
Apr 16 20:14:05.293846 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:05.293826 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/1.log"
Apr 16 20:14:05.294173 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:05.294159 2577 scope.go:117] "RemoveContainer" containerID="58170be2885311314e34046f5b1447725c0ca94cb551a55de747bdef94e0de33"
Apr 16 20:14:05.294364 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:05.294346 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gbsjj_openshift-console-operator(066e6004-882a-4b2c-a626-ecac83f8b502)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" podUID="066e6004-882a-4b2c-a626-ecac83f8b502"
Apr 16 20:14:06.298210 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:06.298169 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4" event={"ID":"2979838d-4d67-4cb3-9880-9d8d330a873f","Type":"ContainerStarted","Data":"da84e7af5b5a2b4e6420850e7c217590b48d73da7e7f9359e322f9ae6fc16a6b"}
Apr 16 20:14:06.298210 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:06.298209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4" event={"ID":"2979838d-4d67-4cb3-9880-9d8d330a873f","Type":"ContainerStarted","Data":"21d07e1d170777e1c6c286d2aec2e8ad0df094bb1c70f89755036a2262c83310"}
Apr 16 20:14:06.315939 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:06.315888 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t6pq4" podStartSLOduration=1.170230354 podStartE2EDuration="2.315871827s" podCreationTimestamp="2026-04-16 20:14:04 +0000 UTC" firstStartedPulling="2026-04-16 20:14:04.505737957 +0000 UTC m=+144.182006181" lastFinishedPulling="2026-04-16 20:14:05.651379444 +0000 UTC m=+145.327647654" observedRunningTime="2026-04-16 20:14:06.314224405 +0000 UTC m=+145.990492641" watchObservedRunningTime="2026-04-16 20:14:06.315871827 +0000 UTC m=+145.992140060"
Apr 16 20:14:07.182517 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:07.182486 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w8lw7_9c16c53d-2e96-42ef-86b8-344e57a267ee/dns-node-resolver/0.log"
Apr 16 20:14:07.684467 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:07.684414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:07.684467 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:07.684477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:07.684878 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:07.684604 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle podName:09c2bd17-9feb-4960-ba71-c41b908e03e4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:15.684585792 +0000 UTC m=+155.360854006 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle") pod "router-default-78d5944d84-snw2w" (UID: "09c2bd17-9feb-4960-ba71-c41b908e03e4") : configmap references non-existent config key: service-ca.crt
Apr 16 20:14:07.684878 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:07.684632 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 20:14:07.684878 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:07.684695 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs podName:09c2bd17-9feb-4960-ba71-c41b908e03e4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:15.684682011 +0000 UTC m=+155.360950227 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs") pod "router-default-78d5944d84-snw2w" (UID: "09c2bd17-9feb-4960-ba71-c41b908e03e4") : secret "router-metrics-certs-default" not found
Apr 16 20:14:07.785099 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:07.785068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-drdc7\" (UID: \"40311283-c4f7-4c58-93d7-21162ac74b07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7"
Apr 16 20:14:07.785258 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:07.785237 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 20:14:07.785331 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:07.785321 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls podName:40311283-c4f7-4c58-93d7-21162ac74b07 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:15.78530377 +0000 UTC m=+155.461571980 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-drdc7" (UID: "40311283-c4f7-4c58-93d7-21162ac74b07") : secret "samples-operator-tls" not found
Apr 16 20:14:08.180997 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:08.180972 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jwfkw_56120f2f-13d6-48da-a3d5-a472edb09dfc/node-ca/0.log"
Apr 16 20:14:09.180150 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:09.180119 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-t6pq4_2979838d-4d67-4cb3-9880-9d8d330a873f/migrator/0.log"
Apr 16 20:14:09.380571 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:09.380539 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-t6pq4_2979838d-4d67-4cb3-9880-9d8d330a873f/graceful-termination/0.log"
Apr 16 20:14:10.305992 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:10.305964 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj"
Apr 16 20:14:10.305992 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:10.305994 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj"
Apr 16 20:14:10.306403 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:10.306386 2577 scope.go:117] "RemoveContainer" containerID="58170be2885311314e34046f5b1447725c0ca94cb551a55de747bdef94e0de33"
Apr 16 20:14:10.306575 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:10.306558 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gbsjj_openshift-console-operator(066e6004-882a-4b2c-a626-ecac83f8b502)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" podUID="066e6004-882a-4b2c-a626-ecac83f8b502"
Apr 16 20:14:15.746471 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:15.746414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:15.746861 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:15.746503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:15.747066 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:15.747046 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c2bd17-9feb-4960-ba71-c41b908e03e4-service-ca-bundle\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:15.748806 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:15.748784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09c2bd17-9feb-4960-ba71-c41b908e03e4-metrics-certs\") pod \"router-default-78d5944d84-snw2w\" (UID: \"09c2bd17-9feb-4960-ba71-c41b908e03e4\") " pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:15.798806 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:15.798767 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:15.847419 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:15.847374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-drdc7\" (UID: \"40311283-c4f7-4c58-93d7-21162ac74b07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7"
Apr 16 20:14:15.850890 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:15.850835 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/40311283-c4f7-4c58-93d7-21162ac74b07-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-drdc7\" (UID: \"40311283-c4f7-4c58-93d7-21162ac74b07\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7"
Apr 16 20:14:15.910757 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:15.910731 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7"
Apr 16 20:14:15.919088 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:15.919062 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-78d5944d84-snw2w"]
Apr 16 20:14:15.923036 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:15.923008 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09c2bd17_9feb_4960_ba71_c41b908e03e4.slice/crio-0ef5d392ce60126edbf40b360ce6bec762b69a8514b00449f9a4d751b0bdbc52 WatchSource:0}: Error finding container 0ef5d392ce60126edbf40b360ce6bec762b69a8514b00449f9a4d751b0bdbc52: Status 404 returned error can't find the container with id 0ef5d392ce60126edbf40b360ce6bec762b69a8514b00449f9a4d751b0bdbc52
Apr 16 20:14:16.034036 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:16.033789 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7"]
Apr 16 20:14:16.322610 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:16.322528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7" event={"ID":"40311283-c4f7-4c58-93d7-21162ac74b07","Type":"ContainerStarted","Data":"2e9e6b651b0d56c41d793e41ec194742f6cc3d362934eb7da3583162f655ef4a"}
Apr 16 20:14:16.323654 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:16.323630 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-78d5944d84-snw2w" event={"ID":"09c2bd17-9feb-4960-ba71-c41b908e03e4","Type":"ContainerStarted","Data":"1ff248d22396f486f7d170a4d32c72c62ca9e7c9b276a2f07a6381efa189a898"}
Apr 16 20:14:16.323727 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:16.323664 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-78d5944d84-snw2w" event={"ID":"09c2bd17-9feb-4960-ba71-c41b908e03e4","Type":"ContainerStarted","Data":"0ef5d392ce60126edbf40b360ce6bec762b69a8514b00449f9a4d751b0bdbc52"}
Apr 16 20:14:16.342067 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:16.342024 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-78d5944d84-snw2w" podStartSLOduration=17.342010973 podStartE2EDuration="17.342010973s" podCreationTimestamp="2026-04-16 20:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:16.341304515 +0000 UTC m=+156.017572748" watchObservedRunningTime="2026-04-16 20:14:16.342010973 +0000 UTC m=+156.018279202"
Apr 16 20:14:16.751919 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:16.751879 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lbx4w" podUID="07dfb99b-8784-4865-98a4-3e29b5215bfd"
Apr 16 20:14:16.776355 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:16.776299 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-nv4vl" podUID="9865442e-a577-4b94-be71-e7cbe1fa9199"
Apr 16 20:14:16.799453 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:16.799421 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:16.801939 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:16.801916 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:16.893759 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:16.893726 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-l9pv6" podUID="c7481f18-a2dc-4df2-8b59-4184bc5c7283"
Apr 16 20:14:17.326623 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:17.326577 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lbx4w"
Apr 16 20:14:17.326920 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:17.326866 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:17.327047 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:17.326936 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nv4vl"
Apr 16 20:14:17.327991 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:17.327968 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-78d5944d84-snw2w"
Apr 16 20:14:18.330932 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:18.330895 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7" event={"ID":"40311283-c4f7-4c58-93d7-21162ac74b07","Type":"ContainerStarted","Data":"51b55ee9c3797413e5d07b3ced6809e441dacc876dd488516ad16a7d2ac6305f"}
Apr 16 20:14:18.331394 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:18.330938 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7" event={"ID":"40311283-c4f7-4c58-93d7-21162ac74b07","Type":"ContainerStarted","Data":"17b615ddb03052838d04825536f48b4b0fcb402c649fe632cdb0dba3b48b4a48"}
Apr 16 20:14:18.350058 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:18.350011 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-drdc7" podStartSLOduration=17.686869196 podStartE2EDuration="19.349997098s" podCreationTimestamp="2026-04-16 20:13:59 +0000 UTC" firstStartedPulling="2026-04-16 20:14:16.077408908 +0000 UTC m=+155.753677118" lastFinishedPulling="2026-04-16 20:14:17.740536811 +0000 UTC m=+157.416805020" observedRunningTime="2026-04-16 20:14:18.349278981 +0000 UTC m=+158.025547390" watchObservedRunningTime="2026-04-16 20:14:18.349997098 +0000 UTC m=+158.026265331"
Apr 16 20:14:21.692805 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:21.692774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl"
Apr 16 20:14:21.693328 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:21.692823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w"
Apr 16 20:14:21.695160 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:21.695137 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07dfb99b-8784-4865-98a4-3e29b5215bfd-metrics-tls\") pod \"dns-default-lbx4w\" (UID: \"07dfb99b-8784-4865-98a4-3e29b5215bfd\") " pod="openshift-dns/dns-default-lbx4w"
Apr 16 20:14:21.695226 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:21.695209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9865442e-a577-4b94-be71-e7cbe1fa9199-cert\") pod \"ingress-canary-nv4vl\" (UID: \"9865442e-a577-4b94-be71-e7cbe1fa9199\") " pod="openshift-ingress-canary/ingress-canary-nv4vl"
Apr 16 20:14:21.830482 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:21.830447 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gq66p\""
Apr 16 20:14:21.831178 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:21.831153 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-28d82\""
Apr 16 20:14:21.838151 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:21.838130 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lbx4w"
Apr 16 20:14:21.838280 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:21.838203 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nv4vl"
Apr 16 20:14:21.995043 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:21.994154 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lbx4w"]
Apr 16 20:14:22.000059 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:22.000028 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07dfb99b_8784_4865_98a4_3e29b5215bfd.slice/crio-7470b2c98bf50f5a3ac77f683a074475e615d532198ab9b7c5821ebea75d4708 WatchSource:0}: Error finding container 7470b2c98bf50f5a3ac77f683a074475e615d532198ab9b7c5821ebea75d4708: Status 404 returned error can't find the container with id 7470b2c98bf50f5a3ac77f683a074475e615d532198ab9b7c5821ebea75d4708
Apr 16 20:14:22.005327 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:22.005297 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nv4vl"]
Apr 16 20:14:22.008636 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:22.008613 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9865442e_a577_4b94_be71_e7cbe1fa9199.slice/crio-bde3c0e097bbe6e76f72f77b18132ff90f68c8ee162d68929d691e9334fa920e WatchSource:0}: Error finding container bde3c0e097bbe6e76f72f77b18132ff90f68c8ee162d68929d691e9334fa920e: Status 404 returned error can't find the container with id bde3c0e097bbe6e76f72f77b18132ff90f68c8ee162d68929d691e9334fa920e
Apr 16 20:14:22.344681 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:22.344639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nv4vl" event={"ID":"9865442e-a577-4b94-be71-e7cbe1fa9199","Type":"ContainerStarted","Data":"bde3c0e097bbe6e76f72f77b18132ff90f68c8ee162d68929d691e9334fa920e"}
Apr 16 20:14:22.345826 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:22.345796 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lbx4w" event={"ID":"07dfb99b-8784-4865-98a4-3e29b5215bfd","Type":"ContainerStarted","Data":"7470b2c98bf50f5a3ac77f683a074475e615d532198ab9b7c5821ebea75d4708"}
Apr 16 20:14:24.352211 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:24.352172 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nv4vl" event={"ID":"9865442e-a577-4b94-be71-e7cbe1fa9199","Type":"ContainerStarted","Data":"1fc32a34ce38c1b21493d489ccda57715848c01545791a2d812d30627758310c"}
Apr 16 20:14:24.353698 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:24.353672 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lbx4w" event={"ID":"07dfb99b-8784-4865-98a4-3e29b5215bfd","Type":"ContainerStarted","Data":"7cabe60e049403e7fedff00d0273cd802dc1d4596ad5abd876f57792a79d1d2b"}
Apr 16 20:14:24.353819 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:24.353702 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lbx4w" event={"ID":"07dfb99b-8784-4865-98a4-3e29b5215bfd","Type":"ContainerStarted","Data":"49ee74e494a03f2f0f59c2dee02026aff2de08c4649f1208422d7ac6b1b3bb90"}
Apr 16 20:14:24.353819 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:24.353803 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lbx4w"
Apr 16 20:14:24.369818 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:24.369779 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nv4vl" podStartSLOduration=129.470269706 podStartE2EDuration="2m11.369764839s" podCreationTimestamp="2026-04-16 20:12:13 +0000 UTC" firstStartedPulling="2026-04-16 20:14:22.010303622 +0000 UTC m=+161.686571836" lastFinishedPulling="2026-04-16 20:14:23.909798759 +0000 UTC m=+163.586066969" observedRunningTime="2026-04-16 20:14:24.369362712 +0000 UTC m=+164.045630954" watchObservedRunningTime="2026-04-16 20:14:24.369764839 +0000 UTC m=+164.046033071"
Apr 16 20:14:24.390902 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:24.390862 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lbx4w" podStartSLOduration=130.134970957 podStartE2EDuration="2m11.39085105s" podCreationTimestamp="2026-04-16 20:12:13 +0000 UTC" firstStartedPulling="2026-04-16 20:14:22.001897174 +0000 UTC m=+161.678165387" lastFinishedPulling="2026-04-16 20:14:23.257777267 +0000 UTC m=+162.934045480" observedRunningTime="2026-04-16 20:14:24.390128981 +0000 UTC m=+164.066397212" watchObservedRunningTime="2026-04-16 20:14:24.39085105 +0000 UTC m=+164.067119323"
Apr 16 20:14:25.870890 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:25.870864 2577 scope.go:117] "RemoveContainer" containerID="58170be2885311314e34046f5b1447725c0ca94cb551a55de747bdef94e0de33"
Apr 16 20:14:26.360501 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:26.360473 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log"
Apr 16 20:14:26.360861 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:26.360844 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/1.log"
Apr 16 20:14:26.360937 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:26.360878 2577 generic.go:358] "Generic (PLEG): container finished" podID="066e6004-882a-4b2c-a626-ecac83f8b502" containerID="a948ce18ec4e45930d3d7677af36b634fba5a06fde7a4d9fff354a17436cd421" exitCode=255
Apr 16 20:14:26.360997 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:26.360963 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" event={"ID":"066e6004-882a-4b2c-a626-ecac83f8b502","Type":"ContainerDied","Data":"a948ce18ec4e45930d3d7677af36b634fba5a06fde7a4d9fff354a17436cd421"}
Apr 16 20:14:26.361031 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:26.361001 2577 scope.go:117] "RemoveContainer" containerID="58170be2885311314e34046f5b1447725c0ca94cb551a55de747bdef94e0de33"
Apr 16 20:14:26.361324 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:26.361300 2577 scope.go:117] "RemoveContainer" containerID="a948ce18ec4e45930d3d7677af36b634fba5a06fde7a4d9fff354a17436cd421"
Apr 16 20:14:26.361513 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:26.361487 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gbsjj_openshift-console-operator(066e6004-882a-4b2c-a626-ecac83f8b502)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" podUID="066e6004-882a-4b2c-a626-ecac83f8b502"
Apr 16 20:14:27.364322 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:27.364296 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log"
Apr 16 20:14:29.590320 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.590285 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-v5l7c"]
Apr 16 20:14:29.593696 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.593671 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v5l7c"
Apr 16 20:14:29.596435 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.596411 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gwklr\""
Apr 16 20:14:29.597176 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.597154 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 20:14:29.597355 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.597335 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 20:14:29.605437 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.605416 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v5l7c"]
Apr 16 20:14:29.653780 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.653750 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/215df8f5-e00a-4e0a-ac8f-587008d6c91b-crio-socket\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c"
Apr 16 20:14:29.653906 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.653786 2577
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/215df8f5-e00a-4e0a-ac8f-587008d6c91b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.653906 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.653812 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9ll6\" (UniqueName: \"kubernetes.io/projected/215df8f5-e00a-4e0a-ac8f-587008d6c91b-kube-api-access-d9ll6\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.653906 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.653892 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/215df8f5-e00a-4e0a-ac8f-587008d6c91b-data-volume\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.654025 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.653912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/215df8f5-e00a-4e0a-ac8f-587008d6c91b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.754442 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.754409 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/215df8f5-e00a-4e0a-ac8f-587008d6c91b-data-volume\") 
pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.754442 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.754443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/215df8f5-e00a-4e0a-ac8f-587008d6c91b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.754606 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.754475 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/215df8f5-e00a-4e0a-ac8f-587008d6c91b-crio-socket\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.754606 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.754505 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/215df8f5-e00a-4e0a-ac8f-587008d6c91b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.754606 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.754526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9ll6\" (UniqueName: \"kubernetes.io/projected/215df8f5-e00a-4e0a-ac8f-587008d6c91b-kube-api-access-d9ll6\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.754716 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.754610 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/215df8f5-e00a-4e0a-ac8f-587008d6c91b-crio-socket\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.754833 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.754812 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/215df8f5-e00a-4e0a-ac8f-587008d6c91b-data-volume\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.755051 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.755030 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/215df8f5-e00a-4e0a-ac8f-587008d6c91b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.756845 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.756821 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/215df8f5-e00a-4e0a-ac8f-587008d6c91b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.763594 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.763571 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9ll6\" (UniqueName: \"kubernetes.io/projected/215df8f5-e00a-4e0a-ac8f-587008d6c91b-kube-api-access-d9ll6\") pod \"insights-runtime-extractor-v5l7c\" (UID: \"215df8f5-e00a-4e0a-ac8f-587008d6c91b\") " 
pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:29.904602 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:29.904584 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v5l7c" Apr 16 20:14:30.018572 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:30.018538 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v5l7c"] Apr 16 20:14:30.021461 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:30.021435 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod215df8f5_e00a_4e0a_ac8f_587008d6c91b.slice/crio-8a8b2cbed5c684316a404bf7a38648d4990702a274ca962d167b57ba9ef77fe2 WatchSource:0}: Error finding container 8a8b2cbed5c684316a404bf7a38648d4990702a274ca962d167b57ba9ef77fe2: Status 404 returned error can't find the container with id 8a8b2cbed5c684316a404bf7a38648d4990702a274ca962d167b57ba9ef77fe2 Apr 16 20:14:30.305783 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:30.305704 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:30.305783 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:30.305740 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:30.306118 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:30.306097 2577 scope.go:117] "RemoveContainer" containerID="a948ce18ec4e45930d3d7677af36b634fba5a06fde7a4d9fff354a17436cd421" Apr 16 20:14:30.306292 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:30.306275 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-gbsjj_openshift-console-operator(066e6004-882a-4b2c-a626-ecac83f8b502)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" podUID="066e6004-882a-4b2c-a626-ecac83f8b502" Apr 16 20:14:30.375224 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:30.375196 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v5l7c" event={"ID":"215df8f5-e00a-4e0a-ac8f-587008d6c91b","Type":"ContainerStarted","Data":"e6edb1944c5f1438d645f23bffc3563cf07ba0b2bb8744bc60ad06bcc71b685a"} Apr 16 20:14:30.375224 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:30.375227 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v5l7c" event={"ID":"215df8f5-e00a-4e0a-ac8f-587008d6c91b","Type":"ContainerStarted","Data":"8a8b2cbed5c684316a404bf7a38648d4990702a274ca962d167b57ba9ef77fe2"} Apr 16 20:14:30.873328 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:30.873303 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:14:31.379359 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:31.379319 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v5l7c" event={"ID":"215df8f5-e00a-4e0a-ac8f-587008d6c91b","Type":"ContainerStarted","Data":"313db65fc835b5bc5369fea9fccc07fc87054bfd0fc849a799dcc21c2bf47d82"} Apr 16 20:14:32.383206 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:32.383173 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v5l7c" event={"ID":"215df8f5-e00a-4e0a-ac8f-587008d6c91b","Type":"ContainerStarted","Data":"45d4f40f5aa73829f50c979fb6fba1b129b0d4f839f4a29a9e0387467e2c3e47"} Apr 16 20:14:32.408744 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:32.408694 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-v5l7c" podStartSLOduration=1.486427648 podStartE2EDuration="3.408679177s" podCreationTimestamp="2026-04-16 20:14:29 +0000 UTC" firstStartedPulling="2026-04-16 20:14:30.07186563 +0000 UTC m=+169.748133844" lastFinishedPulling="2026-04-16 20:14:31.994117147 +0000 UTC m=+171.670385373" observedRunningTime="2026-04-16 20:14:32.407402223 +0000 UTC m=+172.083670455" watchObservedRunningTime="2026-04-16 20:14:32.408679177 +0000 UTC m=+172.084947409" Apr 16 20:14:34.359328 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:34.359301 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lbx4w" Apr 16 20:14:43.006047 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.006011 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qmmmv"] Apr 16 20:14:43.010792 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.010759 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.013468 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.013440 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 20:14:43.013579 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.013494 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 20:14:43.013579 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.013509 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 20:14:43.013837 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.013817 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xns2x\"" Apr 16 20:14:43.013936 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.013851 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 20:14:43.013936 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.013902 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 20:14:43.014420 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.014406 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 20:14:43.048368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.048344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb507573-fa4f-4085-a62c-3ffc4585f0c8-sys\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 
20:14:43.048482 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.048378 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-wtmp\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.048482 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.048434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rz75\" (UniqueName: \"kubernetes.io/projected/eb507573-fa4f-4085-a62c-3ffc4585f0c8-kube-api-access-5rz75\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.048573 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.048487 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-textfile\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.048573 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.048506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb507573-fa4f-4085-a62c-3ffc4585f0c8-metrics-client-ca\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.048573 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.048559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-tls\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.048679 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.048577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/eb507573-fa4f-4085-a62c-3ffc4585f0c8-root\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.048679 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.048593 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-accelerators-collector-config\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.048679 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.048619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.092257 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.092231 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"] Apr 16 20:14:43.094295 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.094276 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g" Apr 16 20:14:43.097053 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.097037 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-7hj5h\"" Apr 16 20:14:43.099338 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.099316 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 20:14:43.099431 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.099339 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 20:14:43.118524 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.118492 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"] Apr 16 20:14:43.124868 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.124849 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-x28rw"] Apr 16 20:14:43.127168 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.127150 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw" Apr 16 20:14:43.129384 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.129365 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 20:14:43.129753 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.129730 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 20:14:43.129825 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.129734 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 20:14:43.129825 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.129796 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-64t4m\"" Apr 16 20:14:43.142737 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.142714 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-x28rw"] Apr 16 20:14:43.149218 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149197 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb507573-fa4f-4085-a62c-3ffc4585f0c8-sys\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.149316 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149236 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce0c9f4-cc74-47c2-8391-1095355362df-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g" Apr 16 20:14:43.149316 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-wtmp\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.149316 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6ce0c9f4-cc74-47c2-8391-1095355362df-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g" Apr 16 20:14:43.149477 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149324 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb507573-fa4f-4085-a62c-3ffc4585f0c8-sys\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.149477 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rz75\" (UniqueName: \"kubernetes.io/projected/eb507573-fa4f-4085-a62c-3ffc4585f0c8-kube-api-access-5rz75\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.149477 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfwq9\" (UniqueName: 
\"kubernetes.io/projected/6ce0c9f4-cc74-47c2-8391-1095355362df-kube-api-access-bfwq9\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g" Apr 16 20:14:43.149477 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149428 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-wtmp\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.149477 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-textfile\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.149733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb507573-fa4f-4085-a62c-3ffc4585f0c8-metrics-client-ca\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.149733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6ce0c9f4-cc74-47c2-8391-1095355362df-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g" Apr 16 20:14:43.149733 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:14:43.149561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-tls\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.149733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/eb507573-fa4f-4085-a62c-3ffc4585f0c8-root\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.149733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-accelerators-collector-config\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.149733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv" Apr 16 20:14:43.149733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149667 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/eb507573-fa4f-4085-a62c-3ffc4585f0c8-root\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " 
pod="openshift-monitoring/node-exporter-qmmmv"
Apr 16 20:14:43.149733 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:43.149686 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 20:14:43.150101 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:43.149778 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-tls podName:eb507573-fa4f-4085-a62c-3ffc4585f0c8 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:43.649758886 +0000 UTC m=+183.326027111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-tls") pod "node-exporter-qmmmv" (UID: "eb507573-fa4f-4085-a62c-3ffc4585f0c8") : secret "node-exporter-tls" not found
Apr 16 20:14:43.150101 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.149793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-textfile\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv"
Apr 16 20:14:43.150182 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.150121 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb507573-fa4f-4085-a62c-3ffc4585f0c8-metrics-client-ca\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv"
Apr 16 20:14:43.150182 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.150130 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-accelerators-collector-config\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv"
Apr 16 20:14:43.151899 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.151883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv"
Apr 16 20:14:43.168662 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.168637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rz75\" (UniqueName: \"kubernetes.io/projected/eb507573-fa4f-4085-a62c-3ffc4585f0c8-kube-api-access-5rz75\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv"
Apr 16 20:14:43.250550 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.250524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6ce0c9f4-cc74-47c2-8391-1095355362df-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"
Apr 16 20:14:43.250675 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.250581 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.250675 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.250603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.250675 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.250628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8pjf\" (UniqueName: \"kubernetes.io/projected/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-api-access-b8pjf\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.250675 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.250648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce0c9f4-cc74-47c2-8391-1095355362df-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"
Apr 16 20:14:43.250854 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.250695 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6ce0c9f4-cc74-47c2-8391-1095355362df-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"
Apr 16 20:14:43.250854 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.250737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.250854 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.250759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.250854 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.250786 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfwq9\" (UniqueName: \"kubernetes.io/projected/6ce0c9f4-cc74-47c2-8391-1095355362df-kube-api-access-bfwq9\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"
Apr 16 20:14:43.250854 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.250823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.251321 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.251301 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce0c9f4-cc74-47c2-8391-1095355362df-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"
Apr 16 20:14:43.252965 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.252917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6ce0c9f4-cc74-47c2-8391-1095355362df-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"
Apr 16 20:14:43.253057 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.253000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6ce0c9f4-cc74-47c2-8391-1095355362df-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"
Apr 16 20:14:43.261807 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.261757 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfwq9\" (UniqueName: \"kubernetes.io/projected/6ce0c9f4-cc74-47c2-8391-1095355362df-kube-api-access-bfwq9\") pod \"openshift-state-metrics-9d44df66c-2dc5g\" (UID: \"6ce0c9f4-cc74-47c2-8391-1095355362df\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"
Apr 16 20:14:43.351137 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.351115 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.351270 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.351157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.351270 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.351200 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.351270 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.351219 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.351270 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.351241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8pjf\" (UniqueName: \"kubernetes.io/projected/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-api-access-b8pjf\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.351476 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.351366 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.351684 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.351662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.351847 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.351827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.351939 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.351917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.353546 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.353528 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.353603 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.353577 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.359389 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.359367 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8pjf\" (UniqueName: \"kubernetes.io/projected/260cd2e6-6dc5-4bf2-bbc0-56935883c1da-kube-api-access-b8pjf\") pod \"kube-state-metrics-69db897b98-x28rw\" (UID: \"260cd2e6-6dc5-4bf2-bbc0-56935883c1da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.402738 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.402717 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"
Apr 16 20:14:43.436001 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.435976 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw"
Apr 16 20:14:43.540305 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.540271 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g"]
Apr 16 20:14:43.543099 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:43.543065 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ce0c9f4_cc74_47c2_8391_1095355362df.slice/crio-eb30dd24fefc07a69f939ea3d2616fa2530a5578aab2c2809714bb8241e679c1 WatchSource:0}: Error finding container eb30dd24fefc07a69f939ea3d2616fa2530a5578aab2c2809714bb8241e679c1: Status 404 returned error can't find the container with id eb30dd24fefc07a69f939ea3d2616fa2530a5578aab2c2809714bb8241e679c1
Apr 16 20:14:43.564739 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.564713 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-x28rw"]
Apr 16 20:14:43.579967 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:43.579920 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod260cd2e6_6dc5_4bf2_bbc0_56935883c1da.slice/crio-df337c6a4670c80e002fbd0993aa34baa61585f9b7cd27460ae073ed112b15d4 WatchSource:0}: Error finding container df337c6a4670c80e002fbd0993aa34baa61585f9b7cd27460ae073ed112b15d4: Status 404 returned error can't find the container with id df337c6a4670c80e002fbd0993aa34baa61585f9b7cd27460ae073ed112b15d4
Apr 16 20:14:43.653061 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.653037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-tls\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv"
Apr 16 20:14:43.655155 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.655134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eb507573-fa4f-4085-a62c-3ffc4585f0c8-node-exporter-tls\") pod \"node-exporter-qmmmv\" (UID: \"eb507573-fa4f-4085-a62c-3ffc4585f0c8\") " pod="openshift-monitoring/node-exporter-qmmmv"
Apr 16 20:14:43.919837 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:43.919805 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qmmmv"
Apr 16 20:14:43.928846 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:43.928815 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb507573_fa4f_4085_a62c_3ffc4585f0c8.slice/crio-c843f0960253a851268d992f28067d12e2fbc47d57a80770ec2e9119e6425260 WatchSource:0}: Error finding container c843f0960253a851268d992f28067d12e2fbc47d57a80770ec2e9119e6425260: Status 404 returned error can't find the container with id c843f0960253a851268d992f28067d12e2fbc47d57a80770ec2e9119e6425260
Apr 16 20:14:44.414103 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:44.414062 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qmmmv" event={"ID":"eb507573-fa4f-4085-a62c-3ffc4585f0c8","Type":"ContainerStarted","Data":"c843f0960253a851268d992f28067d12e2fbc47d57a80770ec2e9119e6425260"}
Apr 16 20:14:44.417965 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:44.417914 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw" event={"ID":"260cd2e6-6dc5-4bf2-bbc0-56935883c1da","Type":"ContainerStarted","Data":"df337c6a4670c80e002fbd0993aa34baa61585f9b7cd27460ae073ed112b15d4"}
Apr 16 20:14:44.420439 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:44.420404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g" event={"ID":"6ce0c9f4-cc74-47c2-8391-1095355362df","Type":"ContainerStarted","Data":"06cdcb9ebf3ebadab3ad023a10ed118c5d786502bab4340a7ace5d9a16223460"}
Apr 16 20:14:44.420527 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:44.420443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g" event={"ID":"6ce0c9f4-cc74-47c2-8391-1095355362df","Type":"ContainerStarted","Data":"526cdb295ef3767458ae685675dc75db32d426ec644dced424f53a65dbe8cc08"}
Apr 16 20:14:44.420527 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:44.420458 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g" event={"ID":"6ce0c9f4-cc74-47c2-8391-1095355362df","Type":"ContainerStarted","Data":"eb30dd24fefc07a69f939ea3d2616fa2530a5578aab2c2809714bb8241e679c1"}
Apr 16 20:14:44.871526 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:44.871500 2577 scope.go:117] "RemoveContainer" containerID="a948ce18ec4e45930d3d7677af36b634fba5a06fde7a4d9fff354a17436cd421"
Apr 16 20:14:44.871747 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:44.871726 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gbsjj_openshift-console-operator(066e6004-882a-4b2c-a626-ecac83f8b502)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" podUID="066e6004-882a-4b2c-a626-ecac83f8b502"
Apr 16 20:14:45.095876 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.095844 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6575947857-tnl4r"]
Apr 16 20:14:45.105297 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.102725 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.106502 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.106006 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 20:14:45.106502 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.106038 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 20:14:45.106502 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.106336 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-9md6p\""
Apr 16 20:14:45.108079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.107361 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6sglig786c0ap\""
Apr 16 20:14:45.108079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.107624 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 20:14:45.108079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.107895 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 20:14:45.108724 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.108529 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 20:14:45.120993 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.120964 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6575947857-tnl4r"]
Apr 16 20:14:45.168043 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.168010 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-metrics-client-ca\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.168214 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.168095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.168214 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.168129 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbzc5\" (UniqueName: \"kubernetes.io/projected/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-kube-api-access-cbzc5\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.168214 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.168167 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.168320 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.168239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.168320 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.168312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-tls\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.168391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.168337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.168391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.168355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-grpc-tls\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.269001 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.268968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbzc5\" (UniqueName: \"kubernetes.io/projected/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-kube-api-access-cbzc5\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.269158 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.269014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.269158 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.269038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.269158 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.269062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-tls\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.269158 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.269081 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.269158 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.269099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-grpc-tls\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.269158 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.269138 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-metrics-client-ca\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.269488 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.269186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.270091 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.270035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-metrics-client-ca\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.272218 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.272172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.272218 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.272172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.272349 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.272257 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.272540 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.272520 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-grpc-tls\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.272618 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.272575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-tls\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.272743 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.272724 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.280561 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.280541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbzc5\" (UniqueName: \"kubernetes.io/projected/9e9c1e01-825d-4a05-9b6a-a90f26ea32a0-kube-api-access-cbzc5\") pod \"thanos-querier-6575947857-tnl4r\" (UID: \"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0\") " pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.424641 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.424552 2577 generic.go:358] "Generic (PLEG): container finished" podID="eb507573-fa4f-4085-a62c-3ffc4585f0c8" containerID="8f35dbcb4e3009be4625add5d8d0738820e63ccc02c6faed663036704c4e2a21" exitCode=0
Apr 16 20:14:45.424641 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.424626 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qmmmv" event={"ID":"eb507573-fa4f-4085-a62c-3ffc4585f0c8","Type":"ContainerDied","Data":"8f35dbcb4e3009be4625add5d8d0738820e63ccc02c6faed663036704c4e2a21"}
Apr 16 20:14:45.425128 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.424974 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r"
Apr 16 20:14:45.427005 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.426913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw" event={"ID":"260cd2e6-6dc5-4bf2-bbc0-56935883c1da","Type":"ContainerStarted","Data":"bcf58f62a0b82627ac68bda3d002123167f2711aff9a1bfe207321d1c2131da9"}
Apr 16 20:14:45.427005 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.426972 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw" event={"ID":"260cd2e6-6dc5-4bf2-bbc0-56935883c1da","Type":"ContainerStarted","Data":"f2b2b557175e9d8a58ed8ff0a779c4a159be0f17d1f541bb17591d83b3931959"}
Apr 16 20:14:45.427005 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.426986 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw" event={"ID":"260cd2e6-6dc5-4bf2-bbc0-56935883c1da","Type":"ContainerStarted","Data":"b602fd9edefa50cd60a2baac054a3c6ba07e14bc1a0a5fc3dda548ba0c5e0ba3"}
Apr 16 20:14:45.429110 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.429087 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g" event={"ID":"6ce0c9f4-cc74-47c2-8391-1095355362df","Type":"ContainerStarted","Data":"953bdfea5e8b9c5462e83cbcd3d6b3250dcae35e7c3b52041954cdbbcc57c7e0"}
Apr 16 20:14:45.468701 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.468652 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2dc5g" podStartSLOduration=1.186476223 podStartE2EDuration="2.468634051s" podCreationTimestamp="2026-04-16 20:14:43 +0000 UTC" firstStartedPulling="2026-04-16 20:14:43.654652472 +0000 UTC m=+183.330920685" lastFinishedPulling="2026-04-16 20:14:44.936810286 +0000 UTC m=+184.613078513" observedRunningTime="2026-04-16 20:14:45.467176891 +0000 UTC m=+185.143445124" watchObservedRunningTime="2026-04-16 20:14:45.468634051 +0000 UTC m=+185.144902282"
Apr 16 20:14:45.494883 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.494838 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-x28rw" podStartSLOduration=1.137646871 podStartE2EDuration="2.494822597s" podCreationTimestamp="2026-04-16 20:14:43 +0000 UTC" firstStartedPulling="2026-04-16 20:14:43.5818272 +0000 UTC m=+183.258095417" lastFinishedPulling="2026-04-16 20:14:44.93900293 +0000 UTC m=+184.615271143" observedRunningTime="2026-04-16 20:14:45.492863302 +0000 UTC m=+185.169131533" watchObservedRunningTime="2026-04-16 20:14:45.494822597 +0000 UTC m=+185.171090829"
Apr 16 20:14:45.571625 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:45.571600 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6575947857-tnl4r"]
Apr 16 20:14:45.573603 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:45.573568 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e9c1e01_825d_4a05_9b6a_a90f26ea32a0.slice/crio-f9303e78191216a8850c6b7a7e548496d0e00662cc68c263df923676100f49b3 WatchSource:0}: Error finding container f9303e78191216a8850c6b7a7e548496d0e00662cc68c263df923676100f49b3: Status 404 returned error can't find the container with id f9303e78191216a8850c6b7a7e548496d0e00662cc68c263df923676100f49b3
Apr 16 20:14:46.434234 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:46.434197 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qmmmv" event={"ID":"eb507573-fa4f-4085-a62c-3ffc4585f0c8","Type":"ContainerStarted","Data":"33399b72c588c5f39e4173ea7fecddb28e05f324420debae70d7a5f98c97984c"}
Apr 16 20:14:46.434234 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:46.434240 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qmmmv" event={"ID":"eb507573-fa4f-4085-a62c-3ffc4585f0c8","Type":"ContainerStarted","Data":"32c7992750c41694beb349aff14cc42183213da350cf2189a388078673f021ef"}
Apr 16 20:14:46.435533 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:46.435492 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r" event={"ID":"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0","Type":"ContainerStarted","Data":"f9303e78191216a8850c6b7a7e548496d0e00662cc68c263df923676100f49b3"}
Apr 16 20:14:46.454973 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:46.454915 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qmmmv" podStartSLOduration=3.4250713360000002 podStartE2EDuration="4.454901336s" podCreationTimestamp="2026-04-16 20:14:42 +0000 UTC" firstStartedPulling="2026-04-16 20:14:43.930976777 +0000 UTC m=+183.607244990" lastFinishedPulling="2026-04-16 20:14:44.96080677 +0000 UTC m=+184.637074990" observedRunningTime="2026-04-16 20:14:46.454672478 +0000 UTC m=+186.130940709" watchObservedRunningTime="2026-04-16 20:14:46.454901336 +0000 UTC m=+186.131169567"
Apr 16 20:14:47.445845 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:47.445812 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r" event={"ID":"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0","Type":"ContainerStarted","Data":"bfac6be4d137cb3b3f51312d1269785d777553cb7de7823f110cff4c374809c4"}
Apr 16 20:14:47.446321 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:47.445855 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r" event={"ID":"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0","Type":"ContainerStarted","Data":"41e8167482823a9e07af4b3441ae1436f8c376d383ce76754c03e58039c84f17"}
Apr 16 20:14:47.794604 ip-10-0-131-31 kubenswrapper[2577]:
I0416 20:14:47.794572 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5"] Apr 16 20:14:47.796711 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:47.796694 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5" Apr 16 20:14:47.798962 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:47.798930 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 20:14:47.799073 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:47.798941 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-75nrp\"" Apr 16 20:14:47.805512 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:47.805490 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5"] Apr 16 20:14:47.893459 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:47.893426 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d427af70-58f2-4318-aa05-1933c3bcd25e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-64wr5\" (UID: \"d427af70-58f2-4318-aa05-1933c3bcd25e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5" Apr 16 20:14:47.994190 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:47.994152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d427af70-58f2-4318-aa05-1933c3bcd25e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-64wr5\" (UID: \"d427af70-58f2-4318-aa05-1933c3bcd25e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5" Apr 16 20:14:47.994326 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:47.994270 2577 secret.go:189] Couldn't get 
secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 20:14:47.994326 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:14:47.994325 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d427af70-58f2-4318-aa05-1933c3bcd25e-monitoring-plugin-cert podName:d427af70-58f2-4318-aa05-1933c3bcd25e nodeName:}" failed. No retries permitted until 2026-04-16 20:14:48.494310569 +0000 UTC m=+188.170578780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/d427af70-58f2-4318-aa05-1933c3bcd25e-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-64wr5" (UID: "d427af70-58f2-4318-aa05-1933c3bcd25e") : secret "monitoring-plugin-cert" not found Apr 16 20:14:48.314101 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.314075 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-75f7f6db87-tnr4d"] Apr 16 20:14:48.317239 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.317221 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.320443 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.320381 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 20:14:48.320443 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.320390 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-5l4t4\"" Apr 16 20:14:48.320809 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.320675 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 20:14:48.320993 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.320943 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 20:14:48.321306 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.321287 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 20:14:48.322396 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.322349 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 20:14:48.333127 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.333106 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 20:14:48.338517 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.338497 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75f7f6db87-tnr4d"] Apr 16 20:14:48.397905 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.397807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.397905 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.397851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvvjp\" (UniqueName: \"kubernetes.io/projected/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-kube-api-access-bvvjp\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.398034 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.397939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-serving-certs-ca-bundle\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.398034 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.398012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-metrics-client-ca\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.398121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.398053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-federate-client-tls\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.398121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.398101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-secret-telemeter-client\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.398265 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.398136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-telemeter-client-tls\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.398265 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.398199 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.451497 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.451465 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r" event={"ID":"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0","Type":"ContainerStarted","Data":"3dde6e8cb97a8ff5262da9af488cd770db824d04661f0eb65aa73faa29306c47"} Apr 16 20:14:48.451825 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.451501 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r" event={"ID":"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0","Type":"ContainerStarted","Data":"ed6482c2915b43e9c775bb2f8d1affbd041c9b6f8fa219bbf63104e584f6289a"} Apr 16 20:14:48.451825 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.451515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r" event={"ID":"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0","Type":"ContainerStarted","Data":"67547bf97cac6b779bee359f607900e57c7dfb8bedf821bad8101ca6cf6ccf52"} Apr 16 20:14:48.451825 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.451528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r" event={"ID":"9e9c1e01-825d-4a05-9b6a-a90f26ea32a0","Type":"ContainerStarted","Data":"5595a6e52fae67b6b0381f4aec72bf0d8f02c5ef490efd4667af01da0991a7d3"} Apr 16 20:14:48.451825 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.451606 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r" Apr 16 20:14:48.477691 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.477641 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r" podStartSLOduration=0.774963242 podStartE2EDuration="3.477626101s" podCreationTimestamp="2026-04-16 20:14:45 +0000 UTC" firstStartedPulling="2026-04-16 20:14:45.575467845 +0000 UTC m=+185.251736054" lastFinishedPulling="2026-04-16 20:14:48.278130702 +0000 UTC m=+187.954398913" observedRunningTime="2026-04-16 20:14:48.476168499 +0000 UTC m=+188.152436731" watchObservedRunningTime="2026-04-16 20:14:48.477626101 +0000 UTC m=+188.153894333" Apr 16 20:14:48.499494 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.499466 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-federate-client-tls\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.499623 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.499512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-secret-telemeter-client\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.499623 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.499529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-telemeter-client-tls\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.499623 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.499576 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.499779 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.499703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.499779 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.499749 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d427af70-58f2-4318-aa05-1933c3bcd25e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-64wr5\" (UID: \"d427af70-58f2-4318-aa05-1933c3bcd25e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5" Apr 16 20:14:48.499891 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.499778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvvjp\" (UniqueName: \"kubernetes.io/projected/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-kube-api-access-bvvjp\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.499891 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.499811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-serving-certs-ca-bundle\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.499891 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.499848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-metrics-client-ca\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " 
pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.500772 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.500682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-serving-certs-ca-bundle\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.500772 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.500682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-metrics-client-ca\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.501016 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.500973 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.502235 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.502208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-secret-telemeter-client\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.502395 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.502377 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-federate-client-tls\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.502692 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.502674 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d427af70-58f2-4318-aa05-1933c3bcd25e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-64wr5\" (UID: \"d427af70-58f2-4318-aa05-1933c3bcd25e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5" Apr 16 20:14:48.503115 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.503092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-telemeter-client-tls\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.503221 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.503203 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.508879 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.508851 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvvjp\" (UniqueName: \"kubernetes.io/projected/2db51ca0-6cdc-4f9c-b747-6a7f612dba5e-kube-api-access-bvvjp\") pod \"telemeter-client-75f7f6db87-tnr4d\" (UID: \"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e\") " 
pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.634726 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.634701 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" Apr 16 20:14:48.706369 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.706338 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5" Apr 16 20:14:48.757537 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.757496 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75f7f6db87-tnr4d"] Apr 16 20:14:48.763832 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:48.763803 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db51ca0_6cdc_4f9c_b747_6a7f612dba5e.slice/crio-83514cc98e6c23957236ebdc0823461d83177f1f394a21e18ffc0a86145e4a8f WatchSource:0}: Error finding container 83514cc98e6c23957236ebdc0823461d83177f1f394a21e18ffc0a86145e4a8f: Status 404 returned error can't find the container with id 83514cc98e6c23957236ebdc0823461d83177f1f394a21e18ffc0a86145e4a8f Apr 16 20:14:48.822700 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:48.822655 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5"] Apr 16 20:14:48.825090 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:48.825063 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd427af70_58f2_4318_aa05_1933c3bcd25e.slice/crio-ede6dedb4440124b6c283d400b1217b42547b778f3ac9a564e32866cf3585256 WatchSource:0}: Error finding container ede6dedb4440124b6c283d400b1217b42547b778f3ac9a564e32866cf3585256: Status 404 returned error can't find the container with id 
ede6dedb4440124b6c283d400b1217b42547b778f3ac9a564e32866cf3585256 Apr 16 20:14:49.456046 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:49.455986 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5" event={"ID":"d427af70-58f2-4318-aa05-1933c3bcd25e","Type":"ContainerStarted","Data":"ede6dedb4440124b6c283d400b1217b42547b778f3ac9a564e32866cf3585256"} Apr 16 20:14:49.457550 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:49.457414 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" event={"ID":"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e","Type":"ContainerStarted","Data":"83514cc98e6c23957236ebdc0823461d83177f1f394a21e18ffc0a86145e4a8f"} Apr 16 20:14:51.464128 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:51.464098 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5" event={"ID":"d427af70-58f2-4318-aa05-1933c3bcd25e","Type":"ContainerStarted","Data":"e05e79b5900d398f8b4dd6412730fae43c83de449c04f63b87015a432b71a695"} Apr 16 20:14:51.465272 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:51.465136 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5" Apr 16 20:14:51.465393 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:51.465373 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" event={"ID":"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e","Type":"ContainerStarted","Data":"1ec1ecfc691901715fa7c7ad34383da08521a43a90196655abbc7b518a8038bc"} Apr 16 20:14:51.469908 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:51.469888 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5" Apr 16 20:14:51.482555 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:51.482508 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-64wr5" podStartSLOduration=2.829307895 podStartE2EDuration="4.482490662s" podCreationTimestamp="2026-04-16 20:14:47 +0000 UTC" firstStartedPulling="2026-04-16 20:14:48.826884833 +0000 UTC m=+188.503153047" lastFinishedPulling="2026-04-16 20:14:50.480067604 +0000 UTC m=+190.156335814" observedRunningTime="2026-04-16 20:14:51.480522889 +0000 UTC m=+191.156791122" watchObservedRunningTime="2026-04-16 20:14:51.482490662 +0000 UTC m=+191.158758895" Apr 16 20:14:52.470054 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:52.470012 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" event={"ID":"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e","Type":"ContainerStarted","Data":"0173a0484d389eff3b2242a67f6e9ba706d9508dd8b86da9d35fd8e087a93401"} Apr 16 20:14:52.470054 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:52.470055 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" event={"ID":"2db51ca0-6cdc-4f9c-b747-6a7f612dba5e","Type":"ContainerStarted","Data":"e60fe33477f346833002e5ba41906c01756b03188dcdc900e5839255c5422375"} Apr 16 20:14:52.495084 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:52.494988 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-75f7f6db87-tnr4d" podStartSLOduration=1.800984718 podStartE2EDuration="4.494970075s" podCreationTimestamp="2026-04-16 20:14:48 +0000 UTC" firstStartedPulling="2026-04-16 20:14:48.766750726 +0000 UTC m=+188.443018944" lastFinishedPulling="2026-04-16 20:14:51.460736078 +0000 UTC m=+191.137004301" observedRunningTime="2026-04-16 20:14:52.494530154 +0000 UTC m=+192.170798388" watchObservedRunningTime="2026-04-16 20:14:52.494970075 +0000 UTC m=+192.171238301" Apr 16 20:14:54.463448 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:14:54.463419 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6575947857-tnl4r" Apr 16 20:14:56.871309 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:56.871277 2577 scope.go:117] "RemoveContainer" containerID="a948ce18ec4e45930d3d7677af36b634fba5a06fde7a4d9fff354a17436cd421" Apr 16 20:14:57.486564 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.486534 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:14:57.486748 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.486632 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" event={"ID":"066e6004-882a-4b2c-a626-ecac83f8b502","Type":"ContainerStarted","Data":"ee82075ad45406ed8f7dc12f6b40aa7eeaaeda5b31eaa749f6af32ab5e673800"} Apr 16 20:14:57.486962 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.486915 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:57.507397 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.507357 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" podStartSLOduration=56.226305489 podStartE2EDuration="58.507345972s" podCreationTimestamp="2026-04-16 20:13:59 +0000 UTC" firstStartedPulling="2026-04-16 20:14:00.430080513 +0000 UTC m=+140.106348726" lastFinishedPulling="2026-04-16 20:14:02.711120986 +0000 UTC m=+142.387389209" observedRunningTime="2026-04-16 20:14:57.505597409 +0000 UTC m=+197.181865640" watchObservedRunningTime="2026-04-16 20:14:57.507345972 +0000 UTC m=+197.183614203" Apr 16 20:14:57.566263 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.566238 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-gbsjj" Apr 16 20:14:57.750735 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.750649 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-lzfpw"] Apr 16 20:14:57.753355 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.753328 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lzfpw" Apr 16 20:14:57.755740 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.755721 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-5tqqr\"" Apr 16 20:14:57.755864 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.755827 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 20:14:57.755864 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.755839 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 20:14:57.763430 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.763400 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lzfpw"] Apr 16 20:14:57.881344 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.881313 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnw8c\" (UniqueName: \"kubernetes.io/projected/3362abe5-749c-485c-998e-fa0949747cac-kube-api-access-dnw8c\") pod \"downloads-6bcc868b7-lzfpw\" (UID: \"3362abe5-749c-485c-998e-fa0949747cac\") " pod="openshift-console/downloads-6bcc868b7-lzfpw" Apr 16 20:14:57.982317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.982283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnw8c\" (UniqueName: 
\"kubernetes.io/projected/3362abe5-749c-485c-998e-fa0949747cac-kube-api-access-dnw8c\") pod \"downloads-6bcc868b7-lzfpw\" (UID: \"3362abe5-749c-485c-998e-fa0949747cac\") " pod="openshift-console/downloads-6bcc868b7-lzfpw" Apr 16 20:14:57.990722 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:57.990701 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnw8c\" (UniqueName: \"kubernetes.io/projected/3362abe5-749c-485c-998e-fa0949747cac-kube-api-access-dnw8c\") pod \"downloads-6bcc868b7-lzfpw\" (UID: \"3362abe5-749c-485c-998e-fa0949747cac\") " pod="openshift-console/downloads-6bcc868b7-lzfpw" Apr 16 20:14:58.064206 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:58.064118 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lzfpw" Apr 16 20:14:58.185310 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:58.185232 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lzfpw"] Apr 16 20:14:58.188100 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:14:58.188060 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3362abe5_749c_485c_998e_fa0949747cac.slice/crio-3cffd694da7d56b76bdecbad2fa131aefa12151dc778a470c2df3a69ba8333cf WatchSource:0}: Error finding container 3cffd694da7d56b76bdecbad2fa131aefa12151dc778a470c2df3a69ba8333cf: Status 404 returned error can't find the container with id 3cffd694da7d56b76bdecbad2fa131aefa12151dc778a470c2df3a69ba8333cf Apr 16 20:14:58.492059 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:14:58.492027 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lzfpw" event={"ID":"3362abe5-749c-485c-998e-fa0949747cac","Type":"ContainerStarted","Data":"3cffd694da7d56b76bdecbad2fa131aefa12151dc778a470c2df3a69ba8333cf"} Apr 16 20:15:08.398340 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:15:08.398306 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54dc6597cd-dpl68"] Apr 16 20:15:08.401294 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.401268 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.405107 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.405080 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 20:15:08.405386 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.405366 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 20:15:08.405504 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.405420 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 20:15:08.405504 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.405455 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 20:15:08.405504 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.405464 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wzzgx\"" Apr 16 20:15:08.405504 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.405464 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 20:15:08.421571 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.421548 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54dc6597cd-dpl68"] Apr 16 20:15:08.584639 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.584609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9fjs\" (UniqueName: 
\"kubernetes.io/projected/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-kube-api-access-p9fjs\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.584788 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.584652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-config\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.584788 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.584722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-oauth-config\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.584788 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.584749 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-service-ca\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.584788 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.584768 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-oauth-serving-cert\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.584970 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:15:08.584794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-serving-cert\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.686281 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.686199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-service-ca\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.686281 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.686233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-oauth-serving-cert\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.686503 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.686352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-serving-cert\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.686503 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.686428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9fjs\" (UniqueName: \"kubernetes.io/projected/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-kube-api-access-p9fjs\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " 
pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.686503 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.686462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-config\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.686658 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.686529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-oauth-config\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.687042 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.687014 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-service-ca\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.687390 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.687354 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-config\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.687505 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.687428 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-oauth-serving-cert\") pod \"console-54dc6597cd-dpl68\" (UID: 
\"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.689428 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.689324 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-oauth-config\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.689641 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.689620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-serving-cert\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.700263 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.700231 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9fjs\" (UniqueName: \"kubernetes.io/projected/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-kube-api-access-p9fjs\") pod \"console-54dc6597cd-dpl68\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:08.713001 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:08.712975 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:13.711649 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:13.711624 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54dc6597cd-dpl68"] Apr 16 20:15:13.716238 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:15:13.716207 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2e7b5fb_0f2a_4fd9_8dd6_cfe6e132098b.slice/crio-7dd815bb61b50634c61674f7267d6daf6f47c8037bc05ed068fbcb7008fbe3ef WatchSource:0}: Error finding container 7dd815bb61b50634c61674f7267d6daf6f47c8037bc05ed068fbcb7008fbe3ef: Status 404 returned error can't find the container with id 7dd815bb61b50634c61674f7267d6daf6f47c8037bc05ed068fbcb7008fbe3ef Apr 16 20:15:14.542453 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:14.542405 2577 generic.go:358] "Generic (PLEG): container finished" podID="f6c48392-b5b1-4068-b4a8-8931576660ed" containerID="008ef17e3e664d0ad59432817fafe41029ddac70d6ca26ce6b12195aba40554c" exitCode=0 Apr 16 20:15:14.542716 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:14.542478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-z9g4t" event={"ID":"f6c48392-b5b1-4068-b4a8-8931576660ed","Type":"ContainerDied","Data":"008ef17e3e664d0ad59432817fafe41029ddac70d6ca26ce6b12195aba40554c"} Apr 16 20:15:14.542908 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:14.542887 2577 scope.go:117] "RemoveContainer" containerID="008ef17e3e664d0ad59432817fafe41029ddac70d6ca26ce6b12195aba40554c" Apr 16 20:15:14.544622 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:14.544550 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54dc6597cd-dpl68" event={"ID":"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b","Type":"ContainerStarted","Data":"7dd815bb61b50634c61674f7267d6daf6f47c8037bc05ed068fbcb7008fbe3ef"} Apr 16 20:15:14.547459 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:14.547386 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lzfpw" event={"ID":"3362abe5-749c-485c-998e-fa0949747cac","Type":"ContainerStarted","Data":"2087732bb1bf7ad5d7af73b2d4156ccb0d1ac103a9fa4af8fe47b11ec5452469"} Apr 16 20:15:14.547825 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:14.547795 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-lzfpw" Apr 16 20:15:14.565059 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:14.565024 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-lzfpw" Apr 16 20:15:14.589129 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:14.589063 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-lzfpw" podStartSLOduration=2.129877179 podStartE2EDuration="17.589042856s" podCreationTimestamp="2026-04-16 20:14:57 +0000 UTC" firstStartedPulling="2026-04-16 20:14:58.190349937 +0000 UTC m=+197.866618150" lastFinishedPulling="2026-04-16 20:15:13.649515617 +0000 UTC m=+213.325783827" observedRunningTime="2026-04-16 20:15:14.587732612 +0000 UTC m=+214.264000857" watchObservedRunningTime="2026-04-16 20:15:14.589042856 +0000 UTC m=+214.265311093" Apr 16 20:15:15.561750 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:15.559105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-z9g4t" event={"ID":"f6c48392-b5b1-4068-b4a8-8931576660ed","Type":"ContainerStarted","Data":"7c902268e84d0eb006a4f7621d43fdf52ea4773034da1eef29e992bd9f2e3da2"} Apr 16 20:15:16.862125 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.862084 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7594bc574f-tlq6c"] Apr 16 20:15:16.883319 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.883278 2577 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7594bc574f-tlq6c"] Apr 16 20:15:16.883450 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.883431 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:16.891548 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.891520 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 20:15:16.955703 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.955670 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-oauth-config\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:16.955703 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.955703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-oauth-serving-cert\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:16.955899 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.955733 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46v2p\" (UniqueName: \"kubernetes.io/projected/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-kube-api-access-46v2p\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:16.955899 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.955862 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-serving-cert\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:16.955899 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.955894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-config\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:16.956046 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.955914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-trusted-ca-bundle\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:16.956046 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.955974 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-service-ca\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:16.983148 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:16.983117 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-x28rw_260cd2e6-6dc5-4bf2-bbc0-56935883c1da/kube-state-metrics/0.log" Apr 16 20:15:17.056428 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.056396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-serving-cert\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.056428 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.056433 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-config\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.056646 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.056460 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-trusted-ca-bundle\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.056646 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.056505 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-service-ca\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.056646 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.056616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-oauth-config\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.056646 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.056644 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-oauth-serving-cert\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.056841 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.056668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46v2p\" (UniqueName: \"kubernetes.io/projected/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-kube-api-access-46v2p\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.057356 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.057329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-service-ca\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.057356 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.057344 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-oauth-serving-cert\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.057533 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.057409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-trusted-ca-bundle\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.057533 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:15:17.057449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-config\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.059322 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.059286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-serving-cert\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.059425 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.059414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-oauth-config\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.071485 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.071465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46v2p\" (UniqueName: \"kubernetes.io/projected/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-kube-api-access-46v2p\") pod \"console-7594bc574f-tlq6c\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.182515 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.182439 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-x28rw_260cd2e6-6dc5-4bf2-bbc0-56935883c1da/kube-rbac-proxy-main/0.log" Apr 16 20:15:17.195469 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.195441 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:17.382305 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.382241 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-x28rw_260cd2e6-6dc5-4bf2-bbc0-56935883c1da/kube-rbac-proxy-self/0.log" Apr 16 20:15:17.432032 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.431942 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7594bc574f-tlq6c"] Apr 16 20:15:17.434435 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:15:17.434377 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e1e8da8_7519_4844_b6ec_eb288cf55d5e.slice/crio-820e889c09eaec9fda58b2f30723cecc8c337c8235fd98f5264a5d9145c00357 WatchSource:0}: Error finding container 820e889c09eaec9fda58b2f30723cecc8c337c8235fd98f5264a5d9145c00357: Status 404 returned error can't find the container with id 820e889c09eaec9fda58b2f30723cecc8c337c8235fd98f5264a5d9145c00357 Apr 16 20:15:17.568362 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.568328 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54dc6597cd-dpl68" event={"ID":"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b","Type":"ContainerStarted","Data":"d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3"} Apr 16 20:15:17.569989 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.569938 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7594bc574f-tlq6c" event={"ID":"5e1e8da8-7519-4844-b6ec-eb288cf55d5e","Type":"ContainerStarted","Data":"d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58"} Apr 16 20:15:17.570115 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.569996 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7594bc574f-tlq6c" 
event={"ID":"5e1e8da8-7519-4844-b6ec-eb288cf55d5e","Type":"ContainerStarted","Data":"820e889c09eaec9fda58b2f30723cecc8c337c8235fd98f5264a5d9145c00357"} Apr 16 20:15:17.586598 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.586550 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54dc6597cd-dpl68" podStartSLOduration=6.011088528 podStartE2EDuration="9.586534609s" podCreationTimestamp="2026-04-16 20:15:08 +0000 UTC" firstStartedPulling="2026-04-16 20:15:13.718069121 +0000 UTC m=+213.394337335" lastFinishedPulling="2026-04-16 20:15:17.293515187 +0000 UTC m=+216.969783416" observedRunningTime="2026-04-16 20:15:17.585422683 +0000 UTC m=+217.261690918" watchObservedRunningTime="2026-04-16 20:15:17.586534609 +0000 UTC m=+217.262802852" Apr 16 20:15:17.607906 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.607848 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7594bc574f-tlq6c" podStartSLOduration=1.607829768 podStartE2EDuration="1.607829768s" podCreationTimestamp="2026-04-16 20:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:15:17.606785105 +0000 UTC m=+217.283053338" watchObservedRunningTime="2026-04-16 20:15:17.607829768 +0000 UTC m=+217.284098001" Apr 16 20:15:17.784419 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:17.784342 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-64wr5_d427af70-58f2-4318-aa05-1933c3bcd25e/monitoring-plugin/0.log" Apr 16 20:15:18.581286 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:18.581260 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qmmmv_eb507573-fa4f-4085-a62c-3ffc4585f0c8/init-textfile/0.log" Apr 16 20:15:18.713663 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:18.713627 2577 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:18.713663 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:18.713665 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:18.719194 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:18.719161 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:18.784131 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:18.784103 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qmmmv_eb507573-fa4f-4085-a62c-3ffc4585f0c8/node-exporter/0.log" Apr 16 20:15:18.982486 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:18.982411 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qmmmv_eb507573-fa4f-4085-a62c-3ffc4585f0c8/kube-rbac-proxy/0.log" Apr 16 20:15:19.580655 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:19.580629 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:19.781873 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:19.781817 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2dc5g_6ce0c9f4-cc74-47c2-8391-1095355362df/kube-rbac-proxy-main/0.log" Apr 16 20:15:19.982407 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:19.982315 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2dc5g_6ce0c9f4-cc74-47c2-8391-1095355362df/kube-rbac-proxy-self/0.log" Apr 16 20:15:20.185549 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:20.185511 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2dc5g_6ce0c9f4-cc74-47c2-8391-1095355362df/openshift-state-metrics/0.log" Apr 16 20:15:22.382533 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:22.382501 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75f7f6db87-tnr4d_2db51ca0-6cdc-4f9c-b747-6a7f612dba5e/telemeter-client/0.log" Apr 16 20:15:22.582058 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:22.582028 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75f7f6db87-tnr4d_2db51ca0-6cdc-4f9c-b747-6a7f612dba5e/reload/0.log" Apr 16 20:15:22.782049 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:22.781960 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75f7f6db87-tnr4d_2db51ca0-6cdc-4f9c-b747-6a7f612dba5e/kube-rbac-proxy/0.log" Apr 16 20:15:22.981532 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:22.981490 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/thanos-query/0.log" Apr 16 20:15:23.182136 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:23.182103 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/kube-rbac-proxy-web/0.log" Apr 16 20:15:23.381566 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:23.381533 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/kube-rbac-proxy/0.log" Apr 16 20:15:23.582121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:23.582094 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/prom-label-proxy/0.log" Apr 16 20:15:23.782139 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:15:23.782111 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/kube-rbac-proxy-rules/0.log" Apr 16 20:15:23.981205 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:23.981113 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/kube-rbac-proxy-metrics/0.log" Apr 16 20:15:24.384566 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:24.384538 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:15:24.584588 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:24.584554 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/3.log" Apr 16 20:15:24.782471 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:24.782395 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54dc6597cd-dpl68_d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b/console/0.log" Apr 16 20:15:24.983975 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:24.983931 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-lzfpw_3362abe5-749c-485c-998e-fa0949747cac/download-server/0.log" Apr 16 20:15:27.195855 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:27.195823 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:27.196365 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:27.195910 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:27.201651 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:15:27.201627 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:27.603864 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:27.603837 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:15:27.688201 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:27.687329 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54dc6597cd-dpl68"] Apr 16 20:15:52.565410 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:52.565379 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:15:52.567599 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:52.567580 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7481f18-a2dc-4df2-8b59-4184bc5c7283-metrics-certs\") pod \"network-metrics-daemon-l9pv6\" (UID: \"c7481f18-a2dc-4df2-8b59-4184bc5c7283\") " pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:15:52.718207 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:52.718149 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-54dc6597cd-dpl68" podUID="d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" containerName="console" containerID="cri-o://d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3" gracePeriod=15 Apr 16 20:15:52.778083 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:52.778056 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8662n\"" Apr 16 20:15:52.785380 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:15:52.785365 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l9pv6" Apr 16 20:15:52.908638 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:52.908494 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l9pv6"] Apr 16 20:15:52.911001 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:15:52.910972 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7481f18_a2dc_4df2_8b59_4184bc5c7283.slice/crio-fd5099136e2796eb835875d9315bbc991aeedc0c69b66659ead9a6e182c45506 WatchSource:0}: Error finding container fd5099136e2796eb835875d9315bbc991aeedc0c69b66659ead9a6e182c45506: Status 404 returned error can't find the container with id fd5099136e2796eb835875d9315bbc991aeedc0c69b66659ead9a6e182c45506 Apr 16 20:15:52.995139 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:52.995118 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54dc6597cd-dpl68_d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b/console/0.log" Apr 16 20:15:52.995256 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:52.995183 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:53.170870 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.170788 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-config\") pod \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " Apr 16 20:15:53.170870 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.170836 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-oauth-serving-cert\") pod \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " Apr 16 20:15:53.171103 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.170861 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-serving-cert\") pod \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " Apr 16 20:15:53.171103 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.170927 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-oauth-config\") pod \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " Apr 16 20:15:53.171103 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.170971 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-service-ca\") pod \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " Apr 16 20:15:53.171103 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:15:53.171022 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9fjs\" (UniqueName: \"kubernetes.io/projected/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-kube-api-access-p9fjs\") pod \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\" (UID: \"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b\") " Apr 16 20:15:53.171383 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.171357 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-config" (OuterVolumeSpecName: "console-config") pod "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" (UID: "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:15:53.171481 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.171386 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" (UID: "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:15:53.171481 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.171443 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-service-ca" (OuterVolumeSpecName: "service-ca") pod "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" (UID: "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:15:53.173209 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.173186 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-kube-api-access-p9fjs" (OuterVolumeSpecName: "kube-api-access-p9fjs") pod "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" (UID: "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b"). InnerVolumeSpecName "kube-api-access-p9fjs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:15:53.173295 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.173227 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" (UID: "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:15:53.173295 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.173263 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" (UID: "d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:15:53.272427 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.272400 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-oauth-config\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:15:53.272427 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.272423 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-service-ca\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:15:53.272579 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.272433 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p9fjs\" (UniqueName: \"kubernetes.io/projected/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-kube-api-access-p9fjs\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:15:53.272579 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.272452 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-config\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:15:53.272579 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.272462 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-oauth-serving-cert\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:15:53.272579 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.272471 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b-console-serving-cert\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:15:53.694355 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:15:53.694317 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l9pv6" event={"ID":"c7481f18-a2dc-4df2-8b59-4184bc5c7283","Type":"ContainerStarted","Data":"fd5099136e2796eb835875d9315bbc991aeedc0c69b66659ead9a6e182c45506"} Apr 16 20:15:53.695420 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.695401 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54dc6597cd-dpl68_d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b/console/0.log" Apr 16 20:15:53.695514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.695435 2577 generic.go:358] "Generic (PLEG): container finished" podID="d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" containerID="d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3" exitCode=2 Apr 16 20:15:53.695514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.695489 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54dc6597cd-dpl68" event={"ID":"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b","Type":"ContainerDied","Data":"d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3"} Apr 16 20:15:53.695514 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.695512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54dc6597cd-dpl68" event={"ID":"d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b","Type":"ContainerDied","Data":"7dd815bb61b50634c61674f7267d6daf6f47c8037bc05ed068fbcb7008fbe3ef"} Apr 16 20:15:53.695610 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.695515 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54dc6597cd-dpl68" Apr 16 20:15:53.695610 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.695526 2577 scope.go:117] "RemoveContainer" containerID="d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3" Apr 16 20:15:53.711544 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.711520 2577 scope.go:117] "RemoveContainer" containerID="d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3" Apr 16 20:15:53.711809 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:15:53.711789 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3\": container with ID starting with d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3 not found: ID does not exist" containerID="d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3" Apr 16 20:15:53.711854 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.711818 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3"} err="failed to get container status \"d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3\": rpc error: code = NotFound desc = could not find container \"d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3\": container with ID starting with d04cd5e4abd9593a448dbc018dc134ca4a9e4a4b3e875ba307eddc291c05efb3 not found: ID does not exist" Apr 16 20:15:53.717529 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.717505 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54dc6597cd-dpl68"] Apr 16 20:15:53.723001 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:53.722982 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54dc6597cd-dpl68"] Apr 16 20:15:54.699564 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:15:54.699535 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l9pv6" event={"ID":"c7481f18-a2dc-4df2-8b59-4184bc5c7283","Type":"ContainerStarted","Data":"6aef1992ea1a907578f5df10cc460b57c021ec1456322b6ba6df3e45f1fca80a"} Apr 16 20:15:54.875549 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:54.875517 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" path="/var/lib/kubelet/pods/d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b/volumes" Apr 16 20:15:55.707478 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:55.707444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l9pv6" event={"ID":"c7481f18-a2dc-4df2-8b59-4184bc5c7283","Type":"ContainerStarted","Data":"5c47a4201480e06c7f675330bc8a5b2b9b2ebeb0bc63d81730f1ea0ea0248096"} Apr 16 20:15:55.725787 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:15:55.725732 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l9pv6" podStartSLOduration=254.227268615 podStartE2EDuration="4m15.725718142s" podCreationTimestamp="2026-04-16 20:11:40 +0000 UTC" firstStartedPulling="2026-04-16 20:15:52.913174739 +0000 UTC m=+252.589442953" lastFinishedPulling="2026-04-16 20:15:54.41162427 +0000 UTC m=+254.087892480" observedRunningTime="2026-04-16 20:15:55.72398575 +0000 UTC m=+255.400253981" watchObservedRunningTime="2026-04-16 20:15:55.725718142 +0000 UTC m=+255.401986370" Apr 16 20:16:16.494501 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.494425 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d4f8855db-zvqt2"] Apr 16 20:16:16.494910 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.494745 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" containerName="console" Apr 16 20:16:16.494910 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:16:16.494756 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" containerName="console" Apr 16 20:16:16.494910 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.494817 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2e7b5fb-0f2a-4fd9-8dd6-cfe6e132098b" containerName="console" Apr 16 20:16:16.499039 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.499016 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.507372 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.507300 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d4f8855db-zvqt2"] Apr 16 20:16:16.556529 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.556493 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-serving-cert\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.556701 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.556549 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-config\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.556701 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.556595 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-oauth-config\") pod \"console-d4f8855db-zvqt2\" (UID: 
\"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.556701 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.556637 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-trusted-ca-bundle\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.556701 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.556702 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-oauth-serving-cert\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.556870 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.556731 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-service-ca\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.556870 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.556752 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnrnf\" (UniqueName: \"kubernetes.io/projected/7b16828e-c4dc-448b-b30f-ca6119b260b7-kube-api-access-vnrnf\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.658092 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.658063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-oauth-serving-cert\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.658245 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.658100 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-service-ca\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.658245 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.658115 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnrnf\" (UniqueName: \"kubernetes.io/projected/7b16828e-c4dc-448b-b30f-ca6119b260b7-kube-api-access-vnrnf\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.658331 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.658261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-serving-cert\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.658331 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.658318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-config\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.658415 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.658365 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-oauth-config\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.658415 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.658393 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-trusted-ca-bundle\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.658889 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.658856 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-oauth-serving-cert\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.659053 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.658856 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-service-ca\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.659131 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.659063 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-config\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.659243 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.659224 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-trusted-ca-bundle\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.660726 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.660702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-oauth-config\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.660822 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.660803 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-serving-cert\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.666527 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.666504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnrnf\" (UniqueName: \"kubernetes.io/projected/7b16828e-c4dc-448b-b30f-ca6119b260b7-kube-api-access-vnrnf\") pod \"console-d4f8855db-zvqt2\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.810173 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.810091 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:16.943284 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:16.943248 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d4f8855db-zvqt2"] Apr 16 20:16:16.946369 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:16:16.946329 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b16828e_c4dc_448b_b30f_ca6119b260b7.slice/crio-b71a2b3f1a9355fe9789604b6028d53420fc4abe36bc51a35f7fb979851d76d4 WatchSource:0}: Error finding container b71a2b3f1a9355fe9789604b6028d53420fc4abe36bc51a35f7fb979851d76d4: Status 404 returned error can't find the container with id b71a2b3f1a9355fe9789604b6028d53420fc4abe36bc51a35f7fb979851d76d4 Apr 16 20:16:17.775270 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:17.775180 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4f8855db-zvqt2" event={"ID":"7b16828e-c4dc-448b-b30f-ca6119b260b7","Type":"ContainerStarted","Data":"fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044"} Apr 16 20:16:17.775270 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:17.775227 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4f8855db-zvqt2" event={"ID":"7b16828e-c4dc-448b-b30f-ca6119b260b7","Type":"ContainerStarted","Data":"b71a2b3f1a9355fe9789604b6028d53420fc4abe36bc51a35f7fb979851d76d4"} Apr 16 20:16:17.791863 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:17.791819 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d4f8855db-zvqt2" podStartSLOduration=1.7918053889999999 podStartE2EDuration="1.791805389s" podCreationTimestamp="2026-04-16 20:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:16:17.790382967 +0000 UTC m=+277.466651201" 
watchObservedRunningTime="2026-04-16 20:16:17.791805389 +0000 UTC m=+277.468073620" Apr 16 20:16:26.811008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:26.810972 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:26.811383 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:26.811020 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:26.815828 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:26.815803 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:27.806985 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:27.806958 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:16:27.869192 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:27.869160 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7594bc574f-tlq6c"] Apr 16 20:16:40.750297 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:40.750265 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:16:40.752181 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:40.752156 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:16:40.766790 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:40.766769 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:16:52.888185 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:52.888125 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7594bc574f-tlq6c" 
podUID="5e1e8da8-7519-4844-b6ec-eb288cf55d5e" containerName="console" containerID="cri-o://d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58" gracePeriod=15 Apr 16 20:16:53.124308 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.124281 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7594bc574f-tlq6c_5e1e8da8-7519-4844-b6ec-eb288cf55d5e/console/0.log" Apr 16 20:16:53.124424 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.124340 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:16:53.257672 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.257573 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-config\") pod \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " Apr 16 20:16:53.257672 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.257647 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-oauth-config\") pod \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " Apr 16 20:16:53.257899 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.257698 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46v2p\" (UniqueName: \"kubernetes.io/projected/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-kube-api-access-46v2p\") pod \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " Apr 16 20:16:53.257899 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.257743 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-service-ca\") pod \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " Apr 16 20:16:53.257899 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.257769 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-oauth-serving-cert\") pod \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " Apr 16 20:16:53.257899 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.257795 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-serving-cert\") pod \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " Apr 16 20:16:53.257899 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.257822 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-trusted-ca-bundle\") pod \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\" (UID: \"5e1e8da8-7519-4844-b6ec-eb288cf55d5e\") " Apr 16 20:16:53.258187 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.258077 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-config" (OuterVolumeSpecName: "console-config") pod "5e1e8da8-7519-4844-b6ec-eb288cf55d5e" (UID: "5e1e8da8-7519-4844-b6ec-eb288cf55d5e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:53.258237 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.258215 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5e1e8da8-7519-4844-b6ec-eb288cf55d5e" (UID: "5e1e8da8-7519-4844-b6ec-eb288cf55d5e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:53.258237 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.258223 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-service-ca" (OuterVolumeSpecName: "service-ca") pod "5e1e8da8-7519-4844-b6ec-eb288cf55d5e" (UID: "5e1e8da8-7519-4844-b6ec-eb288cf55d5e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:53.258311 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.258271 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5e1e8da8-7519-4844-b6ec-eb288cf55d5e" (UID: "5e1e8da8-7519-4844-b6ec-eb288cf55d5e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:53.258311 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.258295 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-config\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:16:53.260149 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.260125 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5e1e8da8-7519-4844-b6ec-eb288cf55d5e" (UID: "5e1e8da8-7519-4844-b6ec-eb288cf55d5e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:53.260251 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.260160 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5e1e8da8-7519-4844-b6ec-eb288cf55d5e" (UID: "5e1e8da8-7519-4844-b6ec-eb288cf55d5e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:53.260251 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.260168 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-kube-api-access-46v2p" (OuterVolumeSpecName: "kube-api-access-46v2p") pod "5e1e8da8-7519-4844-b6ec-eb288cf55d5e" (UID: "5e1e8da8-7519-4844-b6ec-eb288cf55d5e"). InnerVolumeSpecName "kube-api-access-46v2p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:16:53.359477 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.359441 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-oauth-config\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:16:53.359477 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.359472 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-46v2p\" (UniqueName: \"kubernetes.io/projected/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-kube-api-access-46v2p\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:16:53.359477 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.359484 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-service-ca\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:16:53.359651 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.359494 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-oauth-serving-cert\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:16:53.359651 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.359503 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-console-serving-cert\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:16:53.359651 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.359511 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1e8da8-7519-4844-b6ec-eb288cf55d5e-trusted-ca-bundle\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:16:53.885130 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:16:53.885105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7594bc574f-tlq6c_5e1e8da8-7519-4844-b6ec-eb288cf55d5e/console/0.log" Apr 16 20:16:53.885329 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.885142 2577 generic.go:358] "Generic (PLEG): container finished" podID="5e1e8da8-7519-4844-b6ec-eb288cf55d5e" containerID="d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58" exitCode=2 Apr 16 20:16:53.885329 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.885172 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7594bc574f-tlq6c" event={"ID":"5e1e8da8-7519-4844-b6ec-eb288cf55d5e","Type":"ContainerDied","Data":"d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58"} Apr 16 20:16:53.885329 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.885197 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7594bc574f-tlq6c" Apr 16 20:16:53.885329 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.885213 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7594bc574f-tlq6c" event={"ID":"5e1e8da8-7519-4844-b6ec-eb288cf55d5e","Type":"ContainerDied","Data":"820e889c09eaec9fda58b2f30723cecc8c337c8235fd98f5264a5d9145c00357"} Apr 16 20:16:53.885329 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.885236 2577 scope.go:117] "RemoveContainer" containerID="d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58" Apr 16 20:16:53.893568 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.893384 2577 scope.go:117] "RemoveContainer" containerID="d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58" Apr 16 20:16:53.893763 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:16:53.893620 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58\": container with ID starting with d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58 not found: ID does not exist" containerID="d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58" Apr 16 20:16:53.893763 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.893644 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58"} err="failed to get container status \"d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58\": rpc error: code = NotFound desc = could not find container \"d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58\": container with ID starting with d6c9e477d3286341e44a44f1309766324ac74b944439e9648a6a2c9cd0757e58 not found: ID does not exist" Apr 16 20:16:53.906049 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.906025 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7594bc574f-tlq6c"] Apr 16 20:16:53.909548 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:53.909528 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7594bc574f-tlq6c"] Apr 16 20:16:54.874598 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:16:54.874567 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e1e8da8-7519-4844-b6ec-eb288cf55d5e" path="/var/lib/kubelet/pods/5e1e8da8-7519-4844-b6ec-eb288cf55d5e/volumes" Apr 16 20:17:03.261198 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.261163 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht"] Apr 16 20:17:03.261584 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.261485 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e1e8da8-7519-4844-b6ec-eb288cf55d5e" containerName="console" Apr 16 
20:17:03.261584 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.261497 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1e8da8-7519-4844-b6ec-eb288cf55d5e" containerName="console" Apr 16 20:17:03.261584 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.261548 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e1e8da8-7519-4844-b6ec-eb288cf55d5e" containerName="console" Apr 16 20:17:03.264497 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.264482 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:03.267091 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.267069 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:17:03.267216 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.267106 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:17:03.267865 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.267850 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2f7fr\"" Apr 16 20:17:03.272338 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.272317 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht"] Apr 16 20:17:03.342222 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.342196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:03.342331 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.342258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b99n8\" (UniqueName: \"kubernetes.io/projected/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-kube-api-access-b99n8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:03.342331 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.342301 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:03.442678 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.442653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:03.442798 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.442717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b99n8\" (UniqueName: \"kubernetes.io/projected/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-kube-api-access-b99n8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:03.442798 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.442753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:03.443109 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.443086 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:03.443144 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.443098 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:03.453556 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.453525 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b99n8\" (UniqueName: \"kubernetes.io/projected/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-kube-api-access-b99n8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 
20:17:03.575097 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.575020 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:03.694740 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.694717 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht"] Apr 16 20:17:03.697556 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:17:03.697528 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ca1cfa4_7d1b_4af8_a6eb_889e05824da7.slice/crio-669171c5b1b933d154f626350f841ee82f3c80f023de95e5da37ced34c8552de WatchSource:0}: Error finding container 669171c5b1b933d154f626350f841ee82f3c80f023de95e5da37ced34c8552de: Status 404 returned error can't find the container with id 669171c5b1b933d154f626350f841ee82f3c80f023de95e5da37ced34c8552de Apr 16 20:17:03.699684 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.699668 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:17:03.915157 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:03.915118 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" event={"ID":"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7","Type":"ContainerStarted","Data":"669171c5b1b933d154f626350f841ee82f3c80f023de95e5da37ced34c8552de"} Apr 16 20:17:08.932016 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:08.931980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" event={"ID":"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7","Type":"ContainerStarted","Data":"c00d2dd3d94e4234ab1ba2bb722cb9d313df1951ed0c0330690495e005639585"} Apr 16 20:17:09.935276 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:09.935244 2577 generic.go:358] "Generic (PLEG): container finished" podID="6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" containerID="c00d2dd3d94e4234ab1ba2bb722cb9d313df1951ed0c0330690495e005639585" exitCode=0 Apr 16 20:17:09.935628 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:09.935321 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" event={"ID":"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7","Type":"ContainerDied","Data":"c00d2dd3d94e4234ab1ba2bb722cb9d313df1951ed0c0330690495e005639585"} Apr 16 20:17:11.942834 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:11.942809 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" event={"ID":"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7","Type":"ContainerStarted","Data":"5ad6d147101db41f74f9f033c9221aa66172cfa60c4c39722153ed0ff4d17520"} Apr 16 20:17:12.947273 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:12.947238 2577 generic.go:358] "Generic (PLEG): container finished" podID="6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" containerID="5ad6d147101db41f74f9f033c9221aa66172cfa60c4c39722153ed0ff4d17520" exitCode=0 Apr 16 20:17:12.947622 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:12.947330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" event={"ID":"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7","Type":"ContainerDied","Data":"5ad6d147101db41f74f9f033c9221aa66172cfa60c4c39722153ed0ff4d17520"} Apr 16 20:17:19.979181 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:19.979151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" 
event={"ID":"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7","Type":"ContainerStarted","Data":"72831f447a4aaf9603ab85e6e928d0dbfaa890cfa175e6c2448fa0ba02ead162"} Apr 16 20:17:19.998739 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:19.998324 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" podStartSLOduration=0.784676112 podStartE2EDuration="16.99830565s" podCreationTimestamp="2026-04-16 20:17:03 +0000 UTC" firstStartedPulling="2026-04-16 20:17:03.699804619 +0000 UTC m=+323.376072830" lastFinishedPulling="2026-04-16 20:17:19.913434158 +0000 UTC m=+339.589702368" observedRunningTime="2026-04-16 20:17:19.996566223 +0000 UTC m=+339.672834456" watchObservedRunningTime="2026-04-16 20:17:19.99830565 +0000 UTC m=+339.674573883" Apr 16 20:17:20.983605 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:20.983571 2577 generic.go:358] "Generic (PLEG): container finished" podID="6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" containerID="72831f447a4aaf9603ab85e6e928d0dbfaa890cfa175e6c2448fa0ba02ead162" exitCode=0 Apr 16 20:17:20.983968 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:20.983619 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" event={"ID":"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7","Type":"ContainerDied","Data":"72831f447a4aaf9603ab85e6e928d0dbfaa890cfa175e6c2448fa0ba02ead162"} Apr 16 20:17:22.104687 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.104662 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:22.200400 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.200375 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-bundle\") pod \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " Apr 16 20:17:22.200525 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.200430 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-util\") pod \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " Apr 16 20:17:22.200525 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.200470 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b99n8\" (UniqueName: \"kubernetes.io/projected/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-kube-api-access-b99n8\") pod \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\" (UID: \"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7\") " Apr 16 20:17:22.200962 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.200920 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-bundle" (OuterVolumeSpecName: "bundle") pod "6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" (UID: "6ca1cfa4-7d1b-4af8-a6eb-889e05824da7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:17:22.202710 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.202685 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-kube-api-access-b99n8" (OuterVolumeSpecName: "kube-api-access-b99n8") pod "6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" (UID: "6ca1cfa4-7d1b-4af8-a6eb-889e05824da7"). InnerVolumeSpecName "kube-api-access-b99n8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:17:22.204502 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.204485 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-util" (OuterVolumeSpecName: "util") pod "6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" (UID: "6ca1cfa4-7d1b-4af8-a6eb-889e05824da7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:17:22.301252 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.301186 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-util\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:17:22.301252 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.301210 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b99n8\" (UniqueName: \"kubernetes.io/projected/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-kube-api-access-b99n8\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:17:22.301252 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.301221 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ca1cfa4-7d1b-4af8-a6eb-889e05824da7-bundle\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:17:22.990305 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.990282 2577 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" Apr 16 20:17:22.990305 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.990295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4gtht" event={"ID":"6ca1cfa4-7d1b-4af8-a6eb-889e05824da7","Type":"ContainerDied","Data":"669171c5b1b933d154f626350f841ee82f3c80f023de95e5da37ced34c8552de"} Apr 16 20:17:22.990478 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:22.990322 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="669171c5b1b933d154f626350f841ee82f3c80f023de95e5da37ced34c8552de" Apr 16 20:17:25.129615 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.129586 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj"] Apr 16 20:17:25.130119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.130065 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" containerName="util" Apr 16 20:17:25.130119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.130082 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" containerName="util" Apr 16 20:17:25.130119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.130099 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" containerName="pull" Apr 16 20:17:25.130119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.130107 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" containerName="pull" Apr 16 20:17:25.130317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.130123 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" containerName="extract" Apr 16 20:17:25.130317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.130131 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" containerName="extract" Apr 16 20:17:25.130317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.130220 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ca1cfa4-7d1b-4af8-a6eb-889e05824da7" containerName="extract" Apr 16 20:17:25.134503 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.134482 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" Apr 16 20:17:25.137150 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.137128 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 20:17:25.137276 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.137162 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 20:17:25.137276 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.137243 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 20:17:25.137276 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.137260 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-jphks\"" Apr 16 20:17:25.142564 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.142544 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj"] Apr 16 20:17:25.225212 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.225179 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9rt9\" (UniqueName: 
\"kubernetes.io/projected/ac76701b-9c10-48bf-ba32-4c043b253df6-kube-api-access-s9rt9\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8jltj\" (UID: \"ac76701b-9c10-48bf-ba32-4c043b253df6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" Apr 16 20:17:25.225379 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.225238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ac76701b-9c10-48bf-ba32-4c043b253df6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8jltj\" (UID: \"ac76701b-9c10-48bf-ba32-4c043b253df6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" Apr 16 20:17:25.326111 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.326078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ac76701b-9c10-48bf-ba32-4c043b253df6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8jltj\" (UID: \"ac76701b-9c10-48bf-ba32-4c043b253df6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" Apr 16 20:17:25.326238 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.326136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9rt9\" (UniqueName: \"kubernetes.io/projected/ac76701b-9c10-48bf-ba32-4c043b253df6-kube-api-access-s9rt9\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8jltj\" (UID: \"ac76701b-9c10-48bf-ba32-4c043b253df6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" Apr 16 20:17:25.328350 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.328328 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ac76701b-9c10-48bf-ba32-4c043b253df6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8jltj\" (UID: 
\"ac76701b-9c10-48bf-ba32-4c043b253df6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" Apr 16 20:17:25.335013 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.334992 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9rt9\" (UniqueName: \"kubernetes.io/projected/ac76701b-9c10-48bf-ba32-4c043b253df6-kube-api-access-s9rt9\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8jltj\" (UID: \"ac76701b-9c10-48bf-ba32-4c043b253df6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" Apr 16 20:17:25.445331 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.445252 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" Apr 16 20:17:25.563427 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:25.563401 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj"] Apr 16 20:17:25.565638 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:17:25.565606 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac76701b_9c10_48bf_ba32_4c043b253df6.slice/crio-dcee1ab197f9b66fff4c7437d6a22db359e00d9d1d48445035f355b5a60b969a WatchSource:0}: Error finding container dcee1ab197f9b66fff4c7437d6a22db359e00d9d1d48445035f355b5a60b969a: Status 404 returned error can't find the container with id dcee1ab197f9b66fff4c7437d6a22db359e00d9d1d48445035f355b5a60b969a Apr 16 20:17:26.000079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:26.000049 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" event={"ID":"ac76701b-9c10-48bf-ba32-4c043b253df6","Type":"ContainerStarted","Data":"dcee1ab197f9b66fff4c7437d6a22db359e00d9d1d48445035f355b5a60b969a"} Apr 16 20:17:29.011973 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:17:29.011884 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" event={"ID":"ac76701b-9c10-48bf-ba32-4c043b253df6","Type":"ContainerStarted","Data":"ec12d6ae0c42bce915b9b0ec177735d81044a05fd01edc4c2c719fbefecfb9c3"} Apr 16 20:17:29.012340 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.011995 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" Apr 16 20:17:29.034720 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.034673 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" podStartSLOduration=0.894330803 podStartE2EDuration="4.034659979s" podCreationTimestamp="2026-04-16 20:17:25 +0000 UTC" firstStartedPulling="2026-04-16 20:17:25.567285539 +0000 UTC m=+345.243553752" lastFinishedPulling="2026-04-16 20:17:28.707614709 +0000 UTC m=+348.383882928" observedRunningTime="2026-04-16 20:17:29.033391069 +0000 UTC m=+348.709659298" watchObservedRunningTime="2026-04-16 20:17:29.034659979 +0000 UTC m=+348.710928210" Apr 16 20:17:29.242093 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.242063 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-m8q4h"] Apr 16 20:17:29.245643 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.245620 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:29.248007 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.247970 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 20:17:29.248122 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.248042 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 20:17:29.248574 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.248544 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-r6d72\"" Apr 16 20:17:29.256338 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.256313 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-m8q4h"] Apr 16 20:17:29.363196 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.363171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:29.363342 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.363207 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0cb33345-c949-4c73-a43e-a004a725c6c3-cabundle0\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:29.363342 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.363239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2xft\" (UniqueName: 
\"kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-kube-api-access-x2xft\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:29.463898 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.463821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:29.463898 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.463869 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0cb33345-c949-4c73-a43e-a004a725c6c3-cabundle0\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:29.464122 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.463907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2xft\" (UniqueName: \"kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-kube-api-access-x2xft\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:29.464122 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:29.463937 2577 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 16 20:17:29.464122 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:29.463983 2577 secret.go:281] references non-existent secret key: ca.crt Apr 16 20:17:29.464122 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:29.463993 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent 
secret key: ca.crt Apr 16 20:17:29.464122 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:29.464008 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-m8q4h: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 20:17:29.464122 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:29.464073 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates podName:0cb33345-c949-4c73-a43e-a004a725c6c3 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:29.964053316 +0000 UTC m=+349.640321540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates") pod "keda-operator-ffbb595cb-m8q4h" (UID: "0cb33345-c949-4c73-a43e-a004a725c6c3") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 20:17:29.464703 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.464681 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0cb33345-c949-4c73-a43e-a004a725c6c3-cabundle0\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:29.474681 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.474654 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2xft\" (UniqueName: \"kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-kube-api-access-x2xft\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:29.968940 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:29.968910 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:29.969141 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:29.969075 2577 secret.go:281] references non-existent secret key: ca.crt Apr 16 20:17:29.969141 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:29.969094 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 20:17:29.969141 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:29.969105 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-m8q4h: references non-existent secret key: ca.crt Apr 16 20:17:29.969252 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:29.969162 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates podName:0cb33345-c949-4c73-a43e-a004a725c6c3 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:30.969145738 +0000 UTC m=+350.645413948 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates") pod "keda-operator-ffbb595cb-m8q4h" (UID: "0cb33345-c949-4c73-a43e-a004a725c6c3") : references non-existent secret key: ca.crt Apr 16 20:17:30.978707 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:30.978677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:30.979198 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:30.978782 2577 secret.go:281] references non-existent secret key: ca.crt Apr 16 20:17:30.979198 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:30.978797 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 20:17:30.979198 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:30.978809 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-m8q4h: references non-existent secret key: ca.crt Apr 16 20:17:30.979198 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:17:30.978874 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates podName:0cb33345-c949-4c73-a43e-a004a725c6c3 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:32.97885418 +0000 UTC m=+352.655122410 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates") pod "keda-operator-ffbb595cb-m8q4h" (UID: "0cb33345-c949-4c73-a43e-a004a725c6c3") : references non-existent secret key: ca.crt Apr 16 20:17:32.994007 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:32.993972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:32.996426 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:32.996404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0cb33345-c949-4c73-a43e-a004a725c6c3-certificates\") pod \"keda-operator-ffbb595cb-m8q4h\" (UID: \"0cb33345-c949-4c73-a43e-a004a725c6c3\") " pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:33.162487 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:33.162452 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:33.291715 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:33.291643 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-m8q4h"] Apr 16 20:17:33.294574 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:17:33.294543 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cb33345_c949_4c73_a43e_a004a725c6c3.slice/crio-a8e982902626c16c37737adaeef90e9f48a4b36d4e5aeb3c183c1b582b7f4453 WatchSource:0}: Error finding container a8e982902626c16c37737adaeef90e9f48a4b36d4e5aeb3c183c1b582b7f4453: Status 404 returned error can't find the container with id a8e982902626c16c37737adaeef90e9f48a4b36d4e5aeb3c183c1b582b7f4453 Apr 16 20:17:34.028534 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:34.028490 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" event={"ID":"0cb33345-c949-4c73-a43e-a004a725c6c3","Type":"ContainerStarted","Data":"a8e982902626c16c37737adaeef90e9f48a4b36d4e5aeb3c183c1b582b7f4453"} Apr 16 20:17:37.040543 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:37.040501 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" event={"ID":"0cb33345-c949-4c73-a43e-a004a725c6c3","Type":"ContainerStarted","Data":"a6ab62e97153b2d6d516f4fe384a37b1d7b5fef841e25f5138260c6682ea7b34"} Apr 16 20:17:37.040894 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:37.040620 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:17:37.071160 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:37.071115 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" podStartSLOduration=5.139811183 podStartE2EDuration="8.071102035s" 
podCreationTimestamp="2026-04-16 20:17:29 +0000 UTC" firstStartedPulling="2026-04-16 20:17:33.295781346 +0000 UTC m=+352.972049556" lastFinishedPulling="2026-04-16 20:17:36.227072198 +0000 UTC m=+355.903340408" observedRunningTime="2026-04-16 20:17:37.069249302 +0000 UTC m=+356.745517534" watchObservedRunningTime="2026-04-16 20:17:37.071102035 +0000 UTC m=+356.747370266" Apr 16 20:17:50.017892 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:50.017863 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8jltj" Apr 16 20:17:58.045578 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:17:58.045549 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-m8q4h" Apr 16 20:18:23.851474 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.851439 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq"] Apr 16 20:18:23.860568 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.860543 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:23.864446 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.864426 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:18:23.865371 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.865341 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:18:23.865502 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.865352 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2f7fr\"" Apr 16 20:18:23.868211 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.867825 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq"] Apr 16 20:18:23.894939 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.894913 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:23.895048 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.894997 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:23.895048 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.895033 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7crf\" (UniqueName: \"kubernetes.io/projected/7eeeffda-809a-4973-9b19-f3716da45b84-kube-api-access-n7crf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:23.995866 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.995837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:23.995978 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.995881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7crf\" (UniqueName: \"kubernetes.io/projected/7eeeffda-809a-4973-9b19-f3716da45b84-kube-api-access-n7crf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:23.995978 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.995925 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:23.996343 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:18:23.996324 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:23.996381 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:23.996335 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:24.003889 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:24.003860 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7crf\" (UniqueName: \"kubernetes.io/projected/7eeeffda-809a-4973-9b19-f3716da45b84-kube-api-access-n7crf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:24.170309 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:24.170241 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:24.494165 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:24.494142 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq"] Apr 16 20:18:24.496447 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:18:24.496408 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eeeffda_809a_4973_9b19_f3716da45b84.slice/crio-e7b14d4003a5be8390d61c3863813c87a3b89b67fc6dd459e81abb1b52c380eb WatchSource:0}: Error finding container e7b14d4003a5be8390d61c3863813c87a3b89b67fc6dd459e81abb1b52c380eb: Status 404 returned error can't find the container with id e7b14d4003a5be8390d61c3863813c87a3b89b67fc6dd459e81abb1b52c380eb Apr 16 20:18:25.196603 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:25.196569 2577 generic.go:358] "Generic (PLEG): container finished" podID="7eeeffda-809a-4973-9b19-f3716da45b84" containerID="74a8a242ffa59d2a77de1eb31058e6671e926219b5d4d51272916e828990149b" exitCode=0 Apr 16 20:18:25.196989 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:25.196656 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" event={"ID":"7eeeffda-809a-4973-9b19-f3716da45b84","Type":"ContainerDied","Data":"74a8a242ffa59d2a77de1eb31058e6671e926219b5d4d51272916e828990149b"} Apr 16 20:18:25.196989 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:25.196692 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" event={"ID":"7eeeffda-809a-4973-9b19-f3716da45b84","Type":"ContainerStarted","Data":"e7b14d4003a5be8390d61c3863813c87a3b89b67fc6dd459e81abb1b52c380eb"} Apr 16 20:18:27.204836 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:18:27.204803 2577 generic.go:358] "Generic (PLEG): container finished" podID="7eeeffda-809a-4973-9b19-f3716da45b84" containerID="9bff458605cfeec8f640c95616bbaf39c4f6e863a4bbf42acc3f766e1f3a1083" exitCode=0 Apr 16 20:18:27.205237 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:27.204889 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" event={"ID":"7eeeffda-809a-4973-9b19-f3716da45b84","Type":"ContainerDied","Data":"9bff458605cfeec8f640c95616bbaf39c4f6e863a4bbf42acc3f766e1f3a1083"} Apr 16 20:18:28.210042 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:28.210007 2577 generic.go:358] "Generic (PLEG): container finished" podID="7eeeffda-809a-4973-9b19-f3716da45b84" containerID="84838c1462c52eeccffa02076081fcf9bc50549bd9042c2621a67ebd652e67de" exitCode=0 Apr 16 20:18:28.210396 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:28.210090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" event={"ID":"7eeeffda-809a-4973-9b19-f3716da45b84","Type":"ContainerDied","Data":"84838c1462c52eeccffa02076081fcf9bc50549bd9042c2621a67ebd652e67de"} Apr 16 20:18:29.331892 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:29.331871 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:29.440172 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:29.440142 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-util\") pod \"7eeeffda-809a-4973-9b19-f3716da45b84\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " Apr 16 20:18:29.440309 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:29.440186 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7crf\" (UniqueName: \"kubernetes.io/projected/7eeeffda-809a-4973-9b19-f3716da45b84-kube-api-access-n7crf\") pod \"7eeeffda-809a-4973-9b19-f3716da45b84\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " Apr 16 20:18:29.440309 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:29.440234 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-bundle\") pod \"7eeeffda-809a-4973-9b19-f3716da45b84\" (UID: \"7eeeffda-809a-4973-9b19-f3716da45b84\") " Apr 16 20:18:29.440847 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:29.440822 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-bundle" (OuterVolumeSpecName: "bundle") pod "7eeeffda-809a-4973-9b19-f3716da45b84" (UID: "7eeeffda-809a-4973-9b19-f3716da45b84"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:29.442344 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:29.442317 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eeeffda-809a-4973-9b19-f3716da45b84-kube-api-access-n7crf" (OuterVolumeSpecName: "kube-api-access-n7crf") pod "7eeeffda-809a-4973-9b19-f3716da45b84" (UID: "7eeeffda-809a-4973-9b19-f3716da45b84"). InnerVolumeSpecName "kube-api-access-n7crf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:18:29.445722 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:29.445689 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-util" (OuterVolumeSpecName: "util") pod "7eeeffda-809a-4973-9b19-f3716da45b84" (UID: "7eeeffda-809a-4973-9b19-f3716da45b84"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:29.541589 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:29.541523 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-bundle\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:18:29.541589 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:29.541547 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7eeeffda-809a-4973-9b19-f3716da45b84-util\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:18:29.541589 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:29.541556 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7crf\" (UniqueName: \"kubernetes.io/projected/7eeeffda-809a-4973-9b19-f3716da45b84-kube-api-access-n7crf\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:18:30.217164 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:30.217136 2577 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" Apr 16 20:18:30.217338 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:30.217135 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gpq" event={"ID":"7eeeffda-809a-4973-9b19-f3716da45b84","Type":"ContainerDied","Data":"e7b14d4003a5be8390d61c3863813c87a3b89b67fc6dd459e81abb1b52c380eb"} Apr 16 20:18:30.217338 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:30.217237 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7b14d4003a5be8390d61c3863813c87a3b89b67fc6dd459e81abb1b52c380eb" Apr 16 20:18:36.727021 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.726987 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7"] Apr 16 20:18:36.727374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.727319 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7eeeffda-809a-4973-9b19-f3716da45b84" containerName="extract" Apr 16 20:18:36.727374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.727331 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eeeffda-809a-4973-9b19-f3716da45b84" containerName="extract" Apr 16 20:18:36.727374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.727340 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7eeeffda-809a-4973-9b19-f3716da45b84" containerName="pull" Apr 16 20:18:36.727374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.727345 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eeeffda-809a-4973-9b19-f3716da45b84" containerName="pull" Apr 16 20:18:36.727374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.727356 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="7eeeffda-809a-4973-9b19-f3716da45b84" containerName="util" Apr 16 20:18:36.727374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.727362 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eeeffda-809a-4973-9b19-f3716da45b84" containerName="util" Apr 16 20:18:36.727565 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.727412 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7eeeffda-809a-4973-9b19-f3716da45b84" containerName="extract" Apr 16 20:18:36.730472 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.730456 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7" Apr 16 20:18:36.737707 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.737675 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-m86qx\"" Apr 16 20:18:36.737861 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.737843 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 20:18:36.738912 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.738894 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:18:36.764016 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.763987 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7"] Apr 16 20:18:36.803427 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.803395 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91e2723f-4813-45ba-9081-548aeb87fcf3-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-542n7\" (UID: 
\"91e2723f-4813-45ba-9081-548aeb87fcf3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7" Apr 16 20:18:36.803553 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.803450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwqsh\" (UniqueName: \"kubernetes.io/projected/91e2723f-4813-45ba-9081-548aeb87fcf3-kube-api-access-pwqsh\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-542n7\" (UID: \"91e2723f-4813-45ba-9081-548aeb87fcf3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7" Apr 16 20:18:36.903851 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.903816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwqsh\" (UniqueName: \"kubernetes.io/projected/91e2723f-4813-45ba-9081-548aeb87fcf3-kube-api-access-pwqsh\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-542n7\" (UID: \"91e2723f-4813-45ba-9081-548aeb87fcf3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7" Apr 16 20:18:36.904030 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.903883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91e2723f-4813-45ba-9081-548aeb87fcf3-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-542n7\" (UID: \"91e2723f-4813-45ba-9081-548aeb87fcf3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7" Apr 16 20:18:36.904248 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.904231 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91e2723f-4813-45ba-9081-548aeb87fcf3-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-542n7\" (UID: \"91e2723f-4813-45ba-9081-548aeb87fcf3\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7" Apr 16 20:18:36.917115 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:36.917084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwqsh\" (UniqueName: \"kubernetes.io/projected/91e2723f-4813-45ba-9081-548aeb87fcf3-kube-api-access-pwqsh\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-542n7\" (UID: \"91e2723f-4813-45ba-9081-548aeb87fcf3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7" Apr 16 20:18:37.039517 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:37.039429 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7" Apr 16 20:18:37.163935 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:37.163895 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7"] Apr 16 20:18:37.166827 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:18:37.166793 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91e2723f_4813_45ba_9081_548aeb87fcf3.slice/crio-48972e6877ad8e4d4fdd29addd5a240f616fce4374be8e189348894d96d2f3ab WatchSource:0}: Error finding container 48972e6877ad8e4d4fdd29addd5a240f616fce4374be8e189348894d96d2f3ab: Status 404 returned error can't find the container with id 48972e6877ad8e4d4fdd29addd5a240f616fce4374be8e189348894d96d2f3ab Apr 16 20:18:37.239868 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:37.239838 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7" event={"ID":"91e2723f-4813-45ba-9081-548aeb87fcf3","Type":"ContainerStarted","Data":"48972e6877ad8e4d4fdd29addd5a240f616fce4374be8e189348894d96d2f3ab"} Apr 16 20:18:40.255445 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:40.255403 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7" event={"ID":"91e2723f-4813-45ba-9081-548aeb87fcf3","Type":"ContainerStarted","Data":"c9cbe8281e54813a1e68d0fd02282f574c5e22a7ba8f3557aaeab9ee7f2a5983"} Apr 16 20:18:40.278849 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:40.278799 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-542n7" podStartSLOduration=2.040459378 podStartE2EDuration="4.278784554s" podCreationTimestamp="2026-04-16 20:18:36 +0000 UTC" firstStartedPulling="2026-04-16 20:18:37.169246146 +0000 UTC m=+416.845514359" lastFinishedPulling="2026-04-16 20:18:39.407571324 +0000 UTC m=+419.083839535" observedRunningTime="2026-04-16 20:18:40.277274703 +0000 UTC m=+419.953542932" watchObservedRunningTime="2026-04-16 20:18:40.278784554 +0000 UTC m=+419.955052785" Apr 16 20:18:45.513694 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.513656 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xwpdk"] Apr 16 20:18:45.517192 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.517170 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-xwpdk" Apr 16 20:18:45.519591 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.519572 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 20:18:45.520397 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.520379 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-dqhhf\"" Apr 16 20:18:45.520494 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.520399 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 20:18:45.523297 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.523267 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xwpdk"] Apr 16 20:18:45.678056 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.678025 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c685624-d3f9-4bd4-9228-7e91d1c3a91d-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-xwpdk\" (UID: \"7c685624-d3f9-4bd4-9228-7e91d1c3a91d\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xwpdk" Apr 16 20:18:45.678196 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.678104 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpmcd\" (UniqueName: \"kubernetes.io/projected/7c685624-d3f9-4bd4-9228-7e91d1c3a91d-kube-api-access-zpmcd\") pod \"cert-manager-cainjector-8966b78d4-xwpdk\" (UID: \"7c685624-d3f9-4bd4-9228-7e91d1c3a91d\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xwpdk" Apr 16 20:18:45.778508 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.778429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c685624-d3f9-4bd4-9228-7e91d1c3a91d-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-xwpdk\" (UID: \"7c685624-d3f9-4bd4-9228-7e91d1c3a91d\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xwpdk" Apr 16 20:18:45.778508 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.778505 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpmcd\" (UniqueName: \"kubernetes.io/projected/7c685624-d3f9-4bd4-9228-7e91d1c3a91d-kube-api-access-zpmcd\") pod \"cert-manager-cainjector-8966b78d4-xwpdk\" (UID: \"7c685624-d3f9-4bd4-9228-7e91d1c3a91d\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xwpdk" Apr 16 20:18:45.786433 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.786400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c685624-d3f9-4bd4-9228-7e91d1c3a91d-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-xwpdk\" (UID: \"7c685624-d3f9-4bd4-9228-7e91d1c3a91d\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xwpdk" Apr 16 20:18:45.786540 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.786451 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpmcd\" (UniqueName: \"kubernetes.io/projected/7c685624-d3f9-4bd4-9228-7e91d1c3a91d-kube-api-access-zpmcd\") pod \"cert-manager-cainjector-8966b78d4-xwpdk\" (UID: \"7c685624-d3f9-4bd4-9228-7e91d1c3a91d\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xwpdk" Apr 16 20:18:45.836865 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.836830 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-xwpdk" Apr 16 20:18:45.944030 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.944001 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt"] Apr 16 20:18:45.948767 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.948745 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:45.951294 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.951269 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:18:45.951538 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.951269 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:18:45.951538 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.951336 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2f7fr\"" Apr 16 20:18:45.955189 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.955162 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt"] Apr 16 20:18:45.972391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:45.972373 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xwpdk"] Apr 16 20:18:45.975132 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:18:45.975097 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c685624_d3f9_4bd4_9228_7e91d1c3a91d.slice/crio-72a96191546098c5ad334bb7ac1496858e5da18c649981724b162de9ec30aeff WatchSource:0}: Error finding container 
72a96191546098c5ad334bb7ac1496858e5da18c649981724b162de9ec30aeff: Status 404 returned error can't find the container with id 72a96191546098c5ad334bb7ac1496858e5da18c649981724b162de9ec30aeff Apr 16 20:18:46.081359 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.081332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt\" (UID: \"1b8d40f3-1b92-4374-87f8-aa7443689376\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:46.081500 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.081372 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6znf\" (UniqueName: \"kubernetes.io/projected/1b8d40f3-1b92-4374-87f8-aa7443689376-kube-api-access-r6znf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt\" (UID: \"1b8d40f3-1b92-4374-87f8-aa7443689376\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:46.081500 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.081397 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt\" (UID: \"1b8d40f3-1b92-4374-87f8-aa7443689376\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:46.182231 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.182198 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-bundle\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt\" (UID: \"1b8d40f3-1b92-4374-87f8-aa7443689376\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:46.182361 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.182240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6znf\" (UniqueName: \"kubernetes.io/projected/1b8d40f3-1b92-4374-87f8-aa7443689376-kube-api-access-r6znf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt\" (UID: \"1b8d40f3-1b92-4374-87f8-aa7443689376\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:46.182361 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.182262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt\" (UID: \"1b8d40f3-1b92-4374-87f8-aa7443689376\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:46.182649 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.182628 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt\" (UID: \"1b8d40f3-1b92-4374-87f8-aa7443689376\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:46.182688 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.182639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt\" (UID: 
\"1b8d40f3-1b92-4374-87f8-aa7443689376\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:46.190067 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.190041 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6znf\" (UniqueName: \"kubernetes.io/projected/1b8d40f3-1b92-4374-87f8-aa7443689376-kube-api-access-r6znf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt\" (UID: \"1b8d40f3-1b92-4374-87f8-aa7443689376\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:46.259733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.259700 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:46.282077 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.282042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-xwpdk" event={"ID":"7c685624-d3f9-4bd4-9228-7e91d1c3a91d","Type":"ContainerStarted","Data":"72a96191546098c5ad334bb7ac1496858e5da18c649981724b162de9ec30aeff"} Apr 16 20:18:46.380386 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:46.380364 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt"] Apr 16 20:18:46.381706 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:18:46.381675 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b8d40f3_1b92_4374_87f8_aa7443689376.slice/crio-aec123870ce100827ad9b882d0420dc3c381c451f9b715e09baeadec6e371339 WatchSource:0}: Error finding container aec123870ce100827ad9b882d0420dc3c381c451f9b715e09baeadec6e371339: Status 404 returned error can't find the container with id 
aec123870ce100827ad9b882d0420dc3c381c451f9b715e09baeadec6e371339 Apr 16 20:18:47.287895 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:47.287857 2577 generic.go:358] "Generic (PLEG): container finished" podID="1b8d40f3-1b92-4374-87f8-aa7443689376" containerID="dec9b011f92f1fa20d9118df4f70df8ae2802bf1e33cdd598ec9cb2a6bed0146" exitCode=0 Apr 16 20:18:47.288313 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:47.287927 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" event={"ID":"1b8d40f3-1b92-4374-87f8-aa7443689376","Type":"ContainerDied","Data":"dec9b011f92f1fa20d9118df4f70df8ae2802bf1e33cdd598ec9cb2a6bed0146"} Apr 16 20:18:47.288313 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:47.287970 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" event={"ID":"1b8d40f3-1b92-4374-87f8-aa7443689376","Type":"ContainerStarted","Data":"aec123870ce100827ad9b882d0420dc3c381c451f9b715e09baeadec6e371339"} Apr 16 20:18:49.298720 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:49.298669 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-xwpdk" event={"ID":"7c685624-d3f9-4bd4-9228-7e91d1c3a91d","Type":"ContainerStarted","Data":"71233eaf4e4c9d7d08d791e115eee097b064c45fb1c68548a4a987c13008b929"} Apr 16 20:18:49.314329 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:49.314283 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-xwpdk" podStartSLOduration=1.561548908 podStartE2EDuration="4.314266842s" podCreationTimestamp="2026-04-16 20:18:45 +0000 UTC" firstStartedPulling="2026-04-16 20:18:45.977150429 +0000 UTC m=+425.653418639" lastFinishedPulling="2026-04-16 20:18:48.729868364 +0000 UTC m=+428.406136573" observedRunningTime="2026-04-16 20:18:49.313026875 +0000 UTC 
m=+428.989295110" watchObservedRunningTime="2026-04-16 20:18:49.314266842 +0000 UTC m=+428.990535073" Apr 16 20:18:51.307432 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:51.307389 2577 generic.go:358] "Generic (PLEG): container finished" podID="1b8d40f3-1b92-4374-87f8-aa7443689376" containerID="fb1118693bef751d821444aa16ea1d8cfbf3dca485c4f73551b283d694692941" exitCode=0 Apr 16 20:18:51.307432 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:51.307433 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" event={"ID":"1b8d40f3-1b92-4374-87f8-aa7443689376","Type":"ContainerDied","Data":"fb1118693bef751d821444aa16ea1d8cfbf3dca485c4f73551b283d694692941"} Apr 16 20:18:52.312599 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:52.312568 2577 generic.go:358] "Generic (PLEG): container finished" podID="1b8d40f3-1b92-4374-87f8-aa7443689376" containerID="a0ad717b38e9cf65aa552a36f81a4f523610c062b8537ed986048e731f4920c5" exitCode=0 Apr 16 20:18:52.312938 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:52.312651 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" event={"ID":"1b8d40f3-1b92-4374-87f8-aa7443689376","Type":"ContainerDied","Data":"a0ad717b38e9cf65aa552a36f81a4f523610c062b8537ed986048e731f4920c5"} Apr 16 20:18:53.435516 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:53.435494 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:18:53.549127 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:53.549094 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6znf\" (UniqueName: \"kubernetes.io/projected/1b8d40f3-1b92-4374-87f8-aa7443689376-kube-api-access-r6znf\") pod \"1b8d40f3-1b92-4374-87f8-aa7443689376\" (UID: \"1b8d40f3-1b92-4374-87f8-aa7443689376\") " Apr 16 20:18:53.549313 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:53.549293 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-util\") pod \"1b8d40f3-1b92-4374-87f8-aa7443689376\" (UID: \"1b8d40f3-1b92-4374-87f8-aa7443689376\") " Apr 16 20:18:53.549362 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:53.549354 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-bundle\") pod \"1b8d40f3-1b92-4374-87f8-aa7443689376\" (UID: \"1b8d40f3-1b92-4374-87f8-aa7443689376\") " Apr 16 20:18:53.549733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:53.549705 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-bundle" (OuterVolumeSpecName: "bundle") pod "1b8d40f3-1b92-4374-87f8-aa7443689376" (UID: "1b8d40f3-1b92-4374-87f8-aa7443689376"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:53.551236 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:53.551213 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b8d40f3-1b92-4374-87f8-aa7443689376-kube-api-access-r6znf" (OuterVolumeSpecName: "kube-api-access-r6znf") pod "1b8d40f3-1b92-4374-87f8-aa7443689376" (UID: "1b8d40f3-1b92-4374-87f8-aa7443689376"). InnerVolumeSpecName "kube-api-access-r6znf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:18:53.554071 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:53.554037 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-util" (OuterVolumeSpecName: "util") pod "1b8d40f3-1b92-4374-87f8-aa7443689376" (UID: "1b8d40f3-1b92-4374-87f8-aa7443689376"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:53.650539 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:53.650510 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-util\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:18:53.650539 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:53.650538 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b8d40f3-1b92-4374-87f8-aa7443689376-bundle\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:18:53.650691 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:53.650547 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r6znf\" (UniqueName: \"kubernetes.io/projected/1b8d40f3-1b92-4374-87f8-aa7443689376-kube-api-access-r6znf\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:18:54.322787 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:54.322742 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" event={"ID":"1b8d40f3-1b92-4374-87f8-aa7443689376","Type":"ContainerDied","Data":"aec123870ce100827ad9b882d0420dc3c381c451f9b715e09baeadec6e371339"} Apr 16 20:18:54.322787 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:54.322785 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aec123870ce100827ad9b882d0420dc3c381c451f9b715e09baeadec6e371339" Apr 16 20:18:54.322787 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:18:54.322760 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkclbt" Apr 16 20:19:14.170194 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.170114 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76"] Apr 16 20:19:14.170600 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.170475 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b8d40f3-1b92-4374-87f8-aa7443689376" containerName="pull" Apr 16 20:19:14.170600 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.170492 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8d40f3-1b92-4374-87f8-aa7443689376" containerName="pull" Apr 16 20:19:14.170600 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.170508 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b8d40f3-1b92-4374-87f8-aa7443689376" containerName="util" Apr 16 20:19:14.170600 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.170520 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8d40f3-1b92-4374-87f8-aa7443689376" containerName="util" Apr 16 20:19:14.170600 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.170528 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1b8d40f3-1b92-4374-87f8-aa7443689376" containerName="extract" Apr 16 20:19:14.170600 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.170534 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8d40f3-1b92-4374-87f8-aa7443689376" containerName="extract" Apr 16 20:19:14.170600 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.170591 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b8d40f3-1b92-4374-87f8-aa7443689376" containerName="extract" Apr 16 20:19:14.177691 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.177661 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:14.180043 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.180018 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:19:14.180141 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.180021 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2f7fr\"" Apr 16 20:19:14.180141 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.180124 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76"] Apr 16 20:19:14.180897 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.180860 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:19:14.213182 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.213152 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54nhl\" (UniqueName: \"kubernetes.io/projected/eecd2eac-77cc-462f-b4c3-948301ef7e33-kube-api-access-54nhl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76\" (UID: 
\"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:14.213329 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.213218 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76\" (UID: \"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:14.213329 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.213267 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76\" (UID: \"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:14.314325 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.314293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54nhl\" (UniqueName: \"kubernetes.io/projected/eecd2eac-77cc-462f-b4c3-948301ef7e33-kube-api-access-54nhl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76\" (UID: \"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:14.314469 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.314342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76\" (UID: \"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:14.314509 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.314464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76\" (UID: \"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:14.314633 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.314619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76\" (UID: \"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:14.314793 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.314776 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76\" (UID: \"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:14.321469 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.321443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54nhl\" (UniqueName: \"kubernetes.io/projected/eecd2eac-77cc-462f-b4c3-948301ef7e33-kube-api-access-54nhl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76\" (UID: \"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 
20:19:14.487551 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.487457 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:14.607142 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:14.607118 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76"] Apr 16 20:19:14.609031 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:19:14.609002 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeecd2eac_77cc_462f_b4c3_948301ef7e33.slice/crio-9fbe9e2c265c253deadc6059dde7fa76ee3444e6087977e3c459149089164692 WatchSource:0}: Error finding container 9fbe9e2c265c253deadc6059dde7fa76ee3444e6087977e3c459149089164692: Status 404 returned error can't find the container with id 9fbe9e2c265c253deadc6059dde7fa76ee3444e6087977e3c459149089164692 Apr 16 20:19:15.398778 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:15.398745 2577 generic.go:358] "Generic (PLEG): container finished" podID="eecd2eac-77cc-462f-b4c3-948301ef7e33" containerID="e88922483695d89fde7b87965c94016ce4f603eccf13475571f5a1cf144e5c65" exitCode=0 Apr 16 20:19:15.399156 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:15.398840 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" event={"ID":"eecd2eac-77cc-462f-b4c3-948301ef7e33","Type":"ContainerDied","Data":"e88922483695d89fde7b87965c94016ce4f603eccf13475571f5a1cf144e5c65"} Apr 16 20:19:15.399156 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:15.398875 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" 
event={"ID":"eecd2eac-77cc-462f-b4c3-948301ef7e33","Type":"ContainerStarted","Data":"9fbe9e2c265c253deadc6059dde7fa76ee3444e6087977e3c459149089164692"} Apr 16 20:19:16.404537 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:16.404459 2577 generic.go:358] "Generic (PLEG): container finished" podID="eecd2eac-77cc-462f-b4c3-948301ef7e33" containerID="426d6768c6e1326aa6bcfd5cb6cfc43c0f6b4a41af22c578d9865b1ae109abc3" exitCode=0 Apr 16 20:19:16.404537 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:16.404509 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" event={"ID":"eecd2eac-77cc-462f-b4c3-948301ef7e33","Type":"ContainerDied","Data":"426d6768c6e1326aa6bcfd5cb6cfc43c0f6b4a41af22c578d9865b1ae109abc3"} Apr 16 20:19:17.410181 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:17.410150 2577 generic.go:358] "Generic (PLEG): container finished" podID="eecd2eac-77cc-462f-b4c3-948301ef7e33" containerID="bd933077d722b59c4c28787524d8e0a6231077fc93346b3f62cccf038fe961c1" exitCode=0 Apr 16 20:19:17.410558 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:17.410212 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" event={"ID":"eecd2eac-77cc-462f-b4c3-948301ef7e33","Type":"ContainerDied","Data":"bd933077d722b59c4c28787524d8e0a6231077fc93346b3f62cccf038fe961c1"} Apr 16 20:19:18.537188 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:18.537159 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:18.653651 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:18.653623 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-bundle\") pod \"eecd2eac-77cc-462f-b4c3-948301ef7e33\" (UID: \"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " Apr 16 20:19:18.653651 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:18.653659 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54nhl\" (UniqueName: \"kubernetes.io/projected/eecd2eac-77cc-462f-b4c3-948301ef7e33-kube-api-access-54nhl\") pod \"eecd2eac-77cc-462f-b4c3-948301ef7e33\" (UID: \"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " Apr 16 20:19:18.653852 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:18.653753 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-util\") pod \"eecd2eac-77cc-462f-b4c3-948301ef7e33\" (UID: \"eecd2eac-77cc-462f-b4c3-948301ef7e33\") " Apr 16 20:19:18.654894 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:18.654860 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-bundle" (OuterVolumeSpecName: "bundle") pod "eecd2eac-77cc-462f-b4c3-948301ef7e33" (UID: "eecd2eac-77cc-462f-b4c3-948301ef7e33"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:19:18.655847 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:18.655825 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eecd2eac-77cc-462f-b4c3-948301ef7e33-kube-api-access-54nhl" (OuterVolumeSpecName: "kube-api-access-54nhl") pod "eecd2eac-77cc-462f-b4c3-948301ef7e33" (UID: "eecd2eac-77cc-462f-b4c3-948301ef7e33"). InnerVolumeSpecName "kube-api-access-54nhl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:19:18.659477 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:18.659454 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-util" (OuterVolumeSpecName: "util") pod "eecd2eac-77cc-462f-b4c3-948301ef7e33" (UID: "eecd2eac-77cc-462f-b4c3-948301ef7e33"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:19:18.755192 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:18.755120 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-util\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:19:18.755192 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:18.755155 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eecd2eac-77cc-462f-b4c3-948301ef7e33-bundle\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:19:18.755192 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:18.755165 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-54nhl\" (UniqueName: \"kubernetes.io/projected/eecd2eac-77cc-462f-b4c3-948301ef7e33-kube-api-access-54nhl\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:19:19.419391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:19.419354 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" event={"ID":"eecd2eac-77cc-462f-b4c3-948301ef7e33","Type":"ContainerDied","Data":"9fbe9e2c265c253deadc6059dde7fa76ee3444e6087977e3c459149089164692"} Apr 16 20:19:19.419391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:19.419388 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wtq76" Apr 16 20:19:19.419597 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:19.419392 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fbe9e2c265c253deadc6059dde7fa76ee3444e6087977e3c459149089164692" Apr 16 20:19:28.804556 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.804523 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz"] Apr 16 20:19:28.805119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.805069 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eecd2eac-77cc-462f-b4c3-948301ef7e33" containerName="pull" Apr 16 20:19:28.805119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.805089 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="eecd2eac-77cc-462f-b4c3-948301ef7e33" containerName="pull" Apr 16 20:19:28.805119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.805117 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eecd2eac-77cc-462f-b4c3-948301ef7e33" containerName="extract" Apr 16 20:19:28.805296 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.805125 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="eecd2eac-77cc-462f-b4c3-948301ef7e33" containerName="extract" Apr 16 20:19:28.805296 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.805146 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="eecd2eac-77cc-462f-b4c3-948301ef7e33" containerName="util" Apr 16 20:19:28.805296 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.805156 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="eecd2eac-77cc-462f-b4c3-948301ef7e33" containerName="util" Apr 16 20:19:28.805296 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.805240 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="eecd2eac-77cc-462f-b4c3-948301ef7e33" containerName="extract" Apr 16 20:19:28.809654 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.809629 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:28.812466 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.812446 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:19:28.812565 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.812443 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2f7fr\"" Apr 16 20:19:28.813174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.813160 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:19:28.823508 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.823486 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz"] Apr 16 20:19:28.939436 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.939405 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:28.939436 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.939438 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:28.939652 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:28.939488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jbd7\" (UniqueName: \"kubernetes.io/projected/55da57ed-39b6-456d-9409-4ceb9477adf6-kube-api-access-6jbd7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:29.040710 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:29.040673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jbd7\" (UniqueName: \"kubernetes.io/projected/55da57ed-39b6-456d-9409-4ceb9477adf6-kube-api-access-6jbd7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:29.040879 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:29.040735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:29.040879 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:29.040763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:29.041057 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:29.041033 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:29.041146 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:29.041128 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:29.051518 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:29.051498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jbd7\" (UniqueName: \"kubernetes.io/projected/55da57ed-39b6-456d-9409-4ceb9477adf6-kube-api-access-6jbd7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 
20:19:29.118823 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:29.118795 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:29.244479 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:29.244454 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz"] Apr 16 20:19:29.245366 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:19:29.245329 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55da57ed_39b6_456d_9409_4ceb9477adf6.slice/crio-f058416419a7cccb86f2d15085f0852f27bc6103c1440c000055d2b11ba47d32 WatchSource:0}: Error finding container f058416419a7cccb86f2d15085f0852f27bc6103c1440c000055d2b11ba47d32: Status 404 returned error can't find the container with id f058416419a7cccb86f2d15085f0852f27bc6103c1440c000055d2b11ba47d32 Apr 16 20:19:29.452443 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:29.452362 2577 generic.go:358] "Generic (PLEG): container finished" podID="55da57ed-39b6-456d-9409-4ceb9477adf6" containerID="ecedfaf2987fc9aad78a981e81e4f4d524b0ff4a35bd10c2c4c11421a757ea74" exitCode=0 Apr 16 20:19:29.452443 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:29.452432 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" event={"ID":"55da57ed-39b6-456d-9409-4ceb9477adf6","Type":"ContainerDied","Data":"ecedfaf2987fc9aad78a981e81e4f4d524b0ff4a35bd10c2c4c11421a757ea74"} Apr 16 20:19:29.452606 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:29.452454 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" 
event={"ID":"55da57ed-39b6-456d-9409-4ceb9477adf6","Type":"ContainerStarted","Data":"f058416419a7cccb86f2d15085f0852f27bc6103c1440c000055d2b11ba47d32"} Apr 16 20:19:30.457278 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:30.457247 2577 generic.go:358] "Generic (PLEG): container finished" podID="55da57ed-39b6-456d-9409-4ceb9477adf6" containerID="7b930dcdccf2f7c691f2ea1515f55c968d1d6f11b49b736237aec3b9773aa1d8" exitCode=0 Apr 16 20:19:30.457650 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:30.457295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" event={"ID":"55da57ed-39b6-456d-9409-4ceb9477adf6","Type":"ContainerDied","Data":"7b930dcdccf2f7c691f2ea1515f55c968d1d6f11b49b736237aec3b9773aa1d8"} Apr 16 20:19:31.462661 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.462622 2577 generic.go:358] "Generic (PLEG): container finished" podID="55da57ed-39b6-456d-9409-4ceb9477adf6" containerID="1c6ae6eac55c6896b3bfdfac122953fabab2104f486970d60788fa97f62c7ade" exitCode=0 Apr 16 20:19:31.463009 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.462708 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" event={"ID":"55da57ed-39b6-456d-9409-4ceb9477adf6","Type":"ContainerDied","Data":"1c6ae6eac55c6896b3bfdfac122953fabab2104f486970d60788fa97f62c7ade"} Apr 16 20:19:31.772588 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.772509 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-986f4df7-frzwd"] Apr 16 20:19:31.775036 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.775022 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.777922 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.777896 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 20:19:31.778054 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.777941 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 20:19:31.778054 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.778010 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 20:19:31.778278 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.778261 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 20:19:31.778372 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.778354 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-qc9df\"" Apr 16 20:19:31.778438 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.778387 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:19:31.786736 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.786712 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-986f4df7-frzwd"] Apr 16 20:19:31.862977 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.862937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-manager-config\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " 
pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.863151 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.863007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jc7d\" (UniqueName: \"kubernetes.io/projected/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-kube-api-access-4jc7d\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.863151 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.863058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-metrics-cert\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.863151 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.863134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-cert\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.963998 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.963944 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-manager-config\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.964167 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.964030 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4jc7d\" (UniqueName: \"kubernetes.io/projected/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-kube-api-access-4jc7d\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.964167 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.964059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-metrics-cert\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.964167 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.964101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-cert\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.964655 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.964631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-manager-config\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.966547 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.966530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-cert\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " 
pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.966698 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.966677 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-metrics-cert\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:31.972017 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:31.971995 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jc7d\" (UniqueName: \"kubernetes.io/projected/ce9b2f6b-9ec4-4a0d-8778-697c8af4b720-kube-api-access-4jc7d\") pod \"lws-controller-manager-986f4df7-frzwd\" (UID: \"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:32.085551 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.085522 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:32.245482 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.245448 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-986f4df7-frzwd"] Apr 16 20:19:32.249054 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:19:32.249015 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce9b2f6b_9ec4_4a0d_8778_697c8af4b720.slice/crio-098db4542431701ae435a7127957bbb2b5f2ac97c75ebddf999efeef342bb695 WatchSource:0}: Error finding container 098db4542431701ae435a7127957bbb2b5f2ac97c75ebddf999efeef342bb695: Status 404 returned error can't find the container with id 098db4542431701ae435a7127957bbb2b5f2ac97c75ebddf999efeef342bb695 Apr 16 20:19:32.467534 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.467452 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" event={"ID":"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720","Type":"ContainerStarted","Data":"098db4542431701ae435a7127957bbb2b5f2ac97c75ebddf999efeef342bb695"} Apr 16 20:19:32.581829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.581808 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:32.674874 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.674841 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-bundle\") pod \"55da57ed-39b6-456d-9409-4ceb9477adf6\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " Apr 16 20:19:32.674874 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.674880 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jbd7\" (UniqueName: \"kubernetes.io/projected/55da57ed-39b6-456d-9409-4ceb9477adf6-kube-api-access-6jbd7\") pod \"55da57ed-39b6-456d-9409-4ceb9477adf6\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " Apr 16 20:19:32.675134 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.674911 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-util\") pod \"55da57ed-39b6-456d-9409-4ceb9477adf6\" (UID: \"55da57ed-39b6-456d-9409-4ceb9477adf6\") " Apr 16 20:19:32.675731 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.675706 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-bundle" (OuterVolumeSpecName: "bundle") pod "55da57ed-39b6-456d-9409-4ceb9477adf6" (UID: "55da57ed-39b6-456d-9409-4ceb9477adf6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:19:32.676956 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.676924 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55da57ed-39b6-456d-9409-4ceb9477adf6-kube-api-access-6jbd7" (OuterVolumeSpecName: "kube-api-access-6jbd7") pod "55da57ed-39b6-456d-9409-4ceb9477adf6" (UID: "55da57ed-39b6-456d-9409-4ceb9477adf6"). InnerVolumeSpecName "kube-api-access-6jbd7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:19:32.680477 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.680458 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-util" (OuterVolumeSpecName: "util") pod "55da57ed-39b6-456d-9409-4ceb9477adf6" (UID: "55da57ed-39b6-456d-9409-4ceb9477adf6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:19:32.776494 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.776415 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-bundle\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:19:32.776494 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.776444 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jbd7\" (UniqueName: \"kubernetes.io/projected/55da57ed-39b6-456d-9409-4ceb9477adf6-kube-api-access-6jbd7\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:19:32.776494 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:32.776455 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55da57ed-39b6-456d-9409-4ceb9477adf6-util\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:19:33.473537 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:33.473504 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" event={"ID":"55da57ed-39b6-456d-9409-4ceb9477adf6","Type":"ContainerDied","Data":"f058416419a7cccb86f2d15085f0852f27bc6103c1440c000055d2b11ba47d32"} Apr 16 20:19:33.473537 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:33.473538 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f058416419a7cccb86f2d15085f0852f27bc6103c1440c000055d2b11ba47d32" Apr 16 20:19:33.473908 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:33.473604 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nk2sz" Apr 16 20:19:34.477767 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:34.477733 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" event={"ID":"ce9b2f6b-9ec4-4a0d-8778-697c8af4b720","Type":"ContainerStarted","Data":"85dbeafa55790e90f0a5a9c213b2572e82ef28b9ae2b4bd24b8b86d11fbb3013"} Apr 16 20:19:34.478137 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:34.477784 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" Apr 16 20:19:34.495869 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:34.495818 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd" podStartSLOduration=1.889843242 podStartE2EDuration="3.49580433s" podCreationTimestamp="2026-04-16 20:19:31 +0000 UTC" firstStartedPulling="2026-04-16 20:19:32.251345468 +0000 UTC m=+471.927613682" lastFinishedPulling="2026-04-16 20:19:33.85730655 +0000 UTC m=+473.533574770" observedRunningTime="2026-04-16 20:19:34.492848064 +0000 UTC m=+474.169116296" watchObservedRunningTime="2026-04-16 20:19:34.49580433 +0000 UTC m=+474.172072561" Apr 16 
20:19:45.483175 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:45.483148 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-986f4df7-frzwd"
Apr 16 20:19:59.684219 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.684183 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"]
Apr 16 20:19:59.684580 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.684521 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55da57ed-39b6-456d-9409-4ceb9477adf6" containerName="pull"
Apr 16 20:19:59.684580 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.684532 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="55da57ed-39b6-456d-9409-4ceb9477adf6" containerName="pull"
Apr 16 20:19:59.684580 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.684546 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55da57ed-39b6-456d-9409-4ceb9477adf6" containerName="extract"
Apr 16 20:19:59.684580 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.684552 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="55da57ed-39b6-456d-9409-4ceb9477adf6" containerName="extract"
Apr 16 20:19:59.684580 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.684572 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55da57ed-39b6-456d-9409-4ceb9477adf6" containerName="util"
Apr 16 20:19:59.684580 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.684577 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="55da57ed-39b6-456d-9409-4ceb9477adf6" containerName="util"
Apr 16 20:19:59.684809 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.684632 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="55da57ed-39b6-456d-9409-4ceb9477adf6" containerName="extract"
Apr 16 20:19:59.689174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.689157 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"
Apr 16 20:19:59.692127 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.692103 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 20:19:59.692886 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.692863 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2f7fr\""
Apr 16 20:19:59.693013 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.692909 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 20:19:59.696051 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.696026 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"]
Apr 16 20:19:59.785285 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.785250 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"]
Apr 16 20:19:59.788816 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.788796 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:19:59.790789 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.790768 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qphcr\" (UniqueName: \"kubernetes.io/projected/a3a46512-57bb-415b-b19d-6ae729dcd3b6-kube-api-access-qphcr\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"
Apr 16 20:19:59.790904 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.790806 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"
Apr 16 20:19:59.790904 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.790831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"
Apr 16 20:19:59.796564 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.796542 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"]
Apr 16 20:19:59.882261 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.882223 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"]
Apr 16 20:19:59.885986 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.885940 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"
Apr 16 20:19:59.892002 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.891974 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"
Apr 16 20:19:59.892197 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.892181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"
Apr 16 20:19:59.892333 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.892316 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:19:59.892484 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.892469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qphcr\" (UniqueName: \"kubernetes.io/projected/a3a46512-57bb-415b-b19d-6ae729dcd3b6-kube-api-access-qphcr\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"
Apr 16 20:19:59.892588 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.892576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:19:59.892687 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.892673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts5qr\" (UniqueName: \"kubernetes.io/projected/e8c670cf-d59c-47cd-9811-82fe2868cb03-kube-api-access-ts5qr\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:19:59.892796 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.892783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f2jh\" (UniqueName: \"kubernetes.io/projected/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-kube-api-access-4f2jh\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"
Apr 16 20:19:59.892894 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.892881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"
Apr 16 20:19:59.893034 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.893012 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"]
Apr 16 20:19:59.893111 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.893014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"
Apr 16 20:19:59.893369 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.893343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"
Apr 16 20:19:59.893530 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.893506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"
Apr 16 20:19:59.900861 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.900828 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qphcr\" (UniqueName: \"kubernetes.io/projected/a3a46512-57bb-415b-b19d-6ae729dcd3b6-kube-api-access-qphcr\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"
Apr 16 20:19:59.983398 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.983321 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"]
Apr 16 20:19:59.987188 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.987172 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:19:59.993613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.993583 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:19:59.993738 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.993626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"
Apr 16 20:19:59.993738 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.993655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"
Apr 16 20:19:59.993738 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.993678 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:19:59.993738 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.993703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:19:59.993738 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.993720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9t6p\" (UniqueName: \"kubernetes.io/projected/cce797e7-a63a-4e5d-b701-29f038dc7783-kube-api-access-g9t6p\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:19:59.994019 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.993823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:19:59.994019 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.993861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts5qr\" (UniqueName: \"kubernetes.io/projected/e8c670cf-d59c-47cd-9811-82fe2868cb03-kube-api-access-ts5qr\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:19:59.994019 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.993904 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4f2jh\" (UniqueName: \"kubernetes.io/projected/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-kube-api-access-4f2jh\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"
Apr 16 20:19:59.994168 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.994099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:19:59.994363 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.994339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"
Apr 16 20:19:59.994475 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.994359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:19:59.994475 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.994445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"
Apr 16 20:19:59.994584 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.994572 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"]
Apr 16 20:19:59.999592 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:19:59.999571 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"
Apr 16 20:20:00.002878 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.002838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts5qr\" (UniqueName: \"kubernetes.io/projected/e8c670cf-d59c-47cd-9811-82fe2868cb03-kube-api-access-ts5qr\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:20:00.002878 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.002854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f2jh\" (UniqueName: \"kubernetes.io/projected/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-kube-api-access-4f2jh\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"
Apr 16 20:20:00.094931 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.094893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:20:00.095119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.094981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:20:00.095119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.095027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9t6p\" (UniqueName: \"kubernetes.io/projected/cce797e7-a63a-4e5d-b701-29f038dc7783-kube-api-access-g9t6p\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:20:00.095396 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.095372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:20:00.095471 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.095423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:20:00.099721 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.099699 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:20:00.103993 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.103971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9t6p\" (UniqueName: \"kubernetes.io/projected/cce797e7-a63a-4e5d-b701-29f038dc7783-kube-api-access-g9t6p\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:20:00.122074 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.122052 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr"]
Apr 16 20:20:00.123369 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:20:00.123344 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3a46512_57bb_415b_b19d_6ae729dcd3b6.slice/crio-fe4d7665320e0094c205dcab90556e117ad47ba83e6f7e55051fa15e6651c747 WatchSource:0}: Error finding container fe4d7665320e0094c205dcab90556e117ad47ba83e6f7e55051fa15e6651c747: Status 404 returned error can't find the container with id fe4d7665320e0094c205dcab90556e117ad47ba83e6f7e55051fa15e6651c747
Apr 16 20:20:00.196991 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.196923 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"
Apr 16 20:20:00.237669 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.229974 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"]
Apr 16 20:20:00.237669 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:20:00.230146 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c670cf_d59c_47cd_9811_82fe2868cb03.slice/crio-edbdf4220a8bb49d7f2b7b5473a32f8593b2bea4f141135a5c04ac3bbfa3f2f0 WatchSource:0}: Error finding container edbdf4220a8bb49d7f2b7b5473a32f8593b2bea4f141135a5c04ac3bbfa3f2f0: Status 404 returned error can't find the container with id edbdf4220a8bb49d7f2b7b5473a32f8593b2bea4f141135a5c04ac3bbfa3f2f0
Apr 16 20:20:00.299720 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.299247 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:20:00.327003 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.326703 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx"]
Apr 16 20:20:00.329863 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:20:00.329832 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fe049ee_02b5_4841_a8ea_22bb6211d9e1.slice/crio-4d4f26bf9cadec60d1a9fb83d81cfc8b4590d422fb3fbfbd1d6e0df5664b1d14 WatchSource:0}: Error finding container 4d4f26bf9cadec60d1a9fb83d81cfc8b4590d422fb3fbfbd1d6e0df5664b1d14: Status 404 returned error can't find the container with id 4d4f26bf9cadec60d1a9fb83d81cfc8b4590d422fb3fbfbd1d6e0df5664b1d14
Apr 16 20:20:00.431462 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.431434 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"]
Apr 16 20:20:00.434848 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:20:00.434819 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcce797e7_a63a_4e5d_b701_29f038dc7783.slice/crio-34acbacdbd54c5fffd15ba30eea680009e4ca83a37ad2e7e1a65c7256f3b3f01 WatchSource:0}: Error finding container 34acbacdbd54c5fffd15ba30eea680009e4ca83a37ad2e7e1a65c7256f3b3f01: Status 404 returned error can't find the container with id 34acbacdbd54c5fffd15ba30eea680009e4ca83a37ad2e7e1a65c7256f3b3f01
Apr 16 20:20:00.581213 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.581181 2577 generic.go:358] "Generic (PLEG): container finished" podID="9fe049ee-02b5-4841-a8ea-22bb6211d9e1" containerID="00ad906ed8a2e5a52e7f49a67491ab3218ee72dfb4ee4d0926edb7aa3bd8c24d" exitCode=0
Apr 16 20:20:00.581343 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.581258 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx" event={"ID":"9fe049ee-02b5-4841-a8ea-22bb6211d9e1","Type":"ContainerDied","Data":"00ad906ed8a2e5a52e7f49a67491ab3218ee72dfb4ee4d0926edb7aa3bd8c24d"}
Apr 16 20:20:00.581343 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.581289 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx" event={"ID":"9fe049ee-02b5-4841-a8ea-22bb6211d9e1","Type":"ContainerStarted","Data":"4d4f26bf9cadec60d1a9fb83d81cfc8b4590d422fb3fbfbd1d6e0df5664b1d14"}
Apr 16 20:20:00.582647 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.582564 2577 generic.go:358] "Generic (PLEG): container finished" podID="e8c670cf-d59c-47cd-9811-82fe2868cb03" containerID="445262ed5e74d714999e380942f8ad14af4a57547d24a27c99ec442a53d4d414" exitCode=0
Apr 16 20:20:00.582647 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.582622 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885" event={"ID":"e8c670cf-d59c-47cd-9811-82fe2868cb03","Type":"ContainerDied","Data":"445262ed5e74d714999e380942f8ad14af4a57547d24a27c99ec442a53d4d414"}
Apr 16 20:20:00.582647 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.582645 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885" event={"ID":"e8c670cf-d59c-47cd-9811-82fe2868cb03","Type":"ContainerStarted","Data":"edbdf4220a8bb49d7f2b7b5473a32f8593b2bea4f141135a5c04ac3bbfa3f2f0"}
Apr 16 20:20:00.583911 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.583892 2577 generic.go:358] "Generic (PLEG): container finished" podID="a3a46512-57bb-415b-b19d-6ae729dcd3b6" containerID="60c8365681e9097e3b53b95ec2b020cfc85bb606d7d38ac45459408ae726b09d" exitCode=0
Apr 16 20:20:00.583992 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.583978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr" event={"ID":"a3a46512-57bb-415b-b19d-6ae729dcd3b6","Type":"ContainerDied","Data":"60c8365681e9097e3b53b95ec2b020cfc85bb606d7d38ac45459408ae726b09d"}
Apr 16 20:20:00.584056 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.584002 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr" event={"ID":"a3a46512-57bb-415b-b19d-6ae729dcd3b6","Type":"ContainerStarted","Data":"fe4d7665320e0094c205dcab90556e117ad47ba83e6f7e55051fa15e6651c747"}
Apr 16 20:20:00.585356 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.585333 2577 generic.go:358] "Generic (PLEG): container finished" podID="cce797e7-a63a-4e5d-b701-29f038dc7783" containerID="840f6d665b1d6328d72ef532139db50b2b7c14eb80b735436bb5ac687c3ce551" exitCode=0
Apr 16 20:20:00.585435 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.585421 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq" event={"ID":"cce797e7-a63a-4e5d-b701-29f038dc7783","Type":"ContainerDied","Data":"840f6d665b1d6328d72ef532139db50b2b7c14eb80b735436bb5ac687c3ce551"}
Apr 16 20:20:00.585487 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:00.585446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq" event={"ID":"cce797e7-a63a-4e5d-b701-29f038dc7783","Type":"ContainerStarted","Data":"34acbacdbd54c5fffd15ba30eea680009e4ca83a37ad2e7e1a65c7256f3b3f01"}
Apr 16 20:20:01.592437 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:01.592407 2577 generic.go:358] "Generic (PLEG): container finished" podID="e8c670cf-d59c-47cd-9811-82fe2868cb03" containerID="ea307de210bb74cbaf5a1460b2bd337ba1e581d2cd0e69910220592faa23342e" exitCode=0
Apr 16 20:20:01.592838 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:01.592501 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885" event={"ID":"e8c670cf-d59c-47cd-9811-82fe2868cb03","Type":"ContainerDied","Data":"ea307de210bb74cbaf5a1460b2bd337ba1e581d2cd0e69910220592faa23342e"}
Apr 16 20:20:01.594125 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:01.594097 2577 generic.go:358] "Generic (PLEG): container finished" podID="cce797e7-a63a-4e5d-b701-29f038dc7783" containerID="3ef97cebf0561a9a1aba2e9ad25509741befc12e150b796aaa9a485d97b683ae" exitCode=0
Apr 16 20:20:01.594249 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:01.594153 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq" event={"ID":"cce797e7-a63a-4e5d-b701-29f038dc7783","Type":"ContainerDied","Data":"3ef97cebf0561a9a1aba2e9ad25509741befc12e150b796aaa9a485d97b683ae"}
Apr 16 20:20:02.599414 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:02.599382 2577 generic.go:358] "Generic (PLEG): container finished" podID="9fe049ee-02b5-4841-a8ea-22bb6211d9e1" containerID="12cb437bf39bd2c15684119e2497edb3d0d0154066d117aaf146bc4a7cf38b53" exitCode=0
Apr 16 20:20:02.599817 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:02.599469 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx" event={"ID":"9fe049ee-02b5-4841-a8ea-22bb6211d9e1","Type":"ContainerDied","Data":"12cb437bf39bd2c15684119e2497edb3d0d0154066d117aaf146bc4a7cf38b53"}
Apr 16 20:20:02.601400 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:02.601374 2577 generic.go:358] "Generic (PLEG): container finished" podID="e8c670cf-d59c-47cd-9811-82fe2868cb03" containerID="3dcabf8230dd977aeef34f47de56e4f0abb0213c331f673982002bf5e3c6d97a" exitCode=0
Apr 16 20:20:02.601494 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:02.601406 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885" event={"ID":"e8c670cf-d59c-47cd-9811-82fe2868cb03","Type":"ContainerDied","Data":"3dcabf8230dd977aeef34f47de56e4f0abb0213c331f673982002bf5e3c6d97a"}
Apr 16 20:20:02.603042 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:02.603024 2577 generic.go:358] "Generic (PLEG): container finished" podID="a3a46512-57bb-415b-b19d-6ae729dcd3b6" containerID="9335e092a954e4040d749be22c286e864d258b9e77301565b34c7aa1e2baa8b7" exitCode=0
Apr 16 20:20:02.603124 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:02.603080 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr" event={"ID":"a3a46512-57bb-415b-b19d-6ae729dcd3b6","Type":"ContainerDied","Data":"9335e092a954e4040d749be22c286e864d258b9e77301565b34c7aa1e2baa8b7"}
Apr 16 20:20:02.604987 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:02.604969 2577 generic.go:358] "Generic (PLEG): container finished" podID="cce797e7-a63a-4e5d-b701-29f038dc7783" containerID="e62cdb5e7578d300dec1358f5cb1ace54a570a87548e72d46a2bcd0b129364e5" exitCode=0
Apr 16 20:20:02.605059 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:02.605001 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq" event={"ID":"cce797e7-a63a-4e5d-b701-29f038dc7783","Type":"ContainerDied","Data":"e62cdb5e7578d300dec1358f5cb1ace54a570a87548e72d46a2bcd0b129364e5"}
Apr 16 20:20:03.611442 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.611407 2577 generic.go:358] "Generic (PLEG): container finished" podID="9fe049ee-02b5-4841-a8ea-22bb6211d9e1" containerID="329e1c402adc6d09485679f0c0a6fded63f3e35738a5244093bedd3797a32b6d" exitCode=0
Apr 16 20:20:03.611826 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.611489 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx" event={"ID":"9fe049ee-02b5-4841-a8ea-22bb6211d9e1","Type":"ContainerDied","Data":"329e1c402adc6d09485679f0c0a6fded63f3e35738a5244093bedd3797a32b6d"}
Apr 16 20:20:03.613285 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.613261 2577 generic.go:358] "Generic (PLEG): container finished" podID="a3a46512-57bb-415b-b19d-6ae729dcd3b6" containerID="e481ebb70fb3e2f6c8e3c69161aceb3bdd92e19f4180ab67fd51732d284b153f" exitCode=0
Apr 16 20:20:03.613408 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.613338 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr" event={"ID":"a3a46512-57bb-415b-b19d-6ae729dcd3b6","Type":"ContainerDied","Data":"e481ebb70fb3e2f6c8e3c69161aceb3bdd92e19f4180ab67fd51732d284b153f"}
Apr 16 20:20:03.763374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.763344 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq"
Apr 16 20:20:03.766320 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.766302 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885"
Apr 16 20:20:03.827759 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.827730 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-bundle\") pod \"cce797e7-a63a-4e5d-b701-29f038dc7783\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") "
Apr 16 20:20:03.827916 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.827802 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts5qr\" (UniqueName: \"kubernetes.io/projected/e8c670cf-d59c-47cd-9811-82fe2868cb03-kube-api-access-ts5qr\") pod \"e8c670cf-d59c-47cd-9811-82fe2868cb03\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") "
Apr 16 20:20:03.827916 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.827827 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-util\") pod \"cce797e7-a63a-4e5d-b701-29f038dc7783\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") "
Apr 16 20:20:03.827916 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.827890 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-bundle\") pod \"e8c670cf-d59c-47cd-9811-82fe2868cb03\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") "
Apr 16 20:20:03.828082 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.827914 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9t6p\" (UniqueName: \"kubernetes.io/projected/cce797e7-a63a-4e5d-b701-29f038dc7783-kube-api-access-g9t6p\") pod \"cce797e7-a63a-4e5d-b701-29f038dc7783\" (UID: \"cce797e7-a63a-4e5d-b701-29f038dc7783\") "
Apr 16 20:20:03.828082 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.827974 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-util\") pod \"e8c670cf-d59c-47cd-9811-82fe2868cb03\" (UID: \"e8c670cf-d59c-47cd-9811-82fe2868cb03\") "
Apr 16 20:20:03.828393 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.828363 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-bundle" (OuterVolumeSpecName: "bundle") pod "cce797e7-a63a-4e5d-b701-29f038dc7783" (UID: "cce797e7-a63a-4e5d-b701-29f038dc7783"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:20:03.828504 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.828399 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-bundle" (OuterVolumeSpecName: "bundle") pod "e8c670cf-d59c-47cd-9811-82fe2868cb03" (UID: "e8c670cf-d59c-47cd-9811-82fe2868cb03"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:20:03.830068 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.830043 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce797e7-a63a-4e5d-b701-29f038dc7783-kube-api-access-g9t6p" (OuterVolumeSpecName: "kube-api-access-g9t6p") pod "cce797e7-a63a-4e5d-b701-29f038dc7783" (UID: "cce797e7-a63a-4e5d-b701-29f038dc7783"). InnerVolumeSpecName "kube-api-access-g9t6p".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:20:03.830297 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.830278 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c670cf-d59c-47cd-9811-82fe2868cb03-kube-api-access-ts5qr" (OuterVolumeSpecName: "kube-api-access-ts5qr") pod "e8c670cf-d59c-47cd-9811-82fe2868cb03" (UID: "e8c670cf-d59c-47cd-9811-82fe2868cb03"). InnerVolumeSpecName "kube-api-access-ts5qr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:20:03.833345 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.833326 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-util" (OuterVolumeSpecName: "util") pod "cce797e7-a63a-4e5d-b701-29f038dc7783" (UID: "cce797e7-a63a-4e5d-b701-29f038dc7783"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:20:03.833401 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.833339 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-util" (OuterVolumeSpecName: "util") pod "e8c670cf-d59c-47cd-9811-82fe2868cb03" (UID: "e8c670cf-d59c-47cd-9811-82fe2868cb03"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:20:03.929163 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.928993 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-util\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:03.929163 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.929022 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-bundle\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:03.929163 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.929038 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ts5qr\" (UniqueName: \"kubernetes.io/projected/e8c670cf-d59c-47cd-9811-82fe2868cb03-kube-api-access-ts5qr\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:03.929163 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.929052 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cce797e7-a63a-4e5d-b701-29f038dc7783-util\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:03.929163 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.929066 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8c670cf-d59c-47cd-9811-82fe2868cb03-bundle\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:03.929163 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:03.929081 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9t6p\" (UniqueName: \"kubernetes.io/projected/cce797e7-a63a-4e5d-b701-29f038dc7783-kube-api-access-g9t6p\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:04.619026 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.618989 2577 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885" Apr 16 20:20:04.619427 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.618991 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec8846885" event={"ID":"e8c670cf-d59c-47cd-9811-82fe2868cb03","Type":"ContainerDied","Data":"edbdf4220a8bb49d7f2b7b5473a32f8593b2bea4f141135a5c04ac3bbfa3f2f0"} Apr 16 20:20:04.619427 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.619099 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edbdf4220a8bb49d7f2b7b5473a32f8593b2bea4f141135a5c04ac3bbfa3f2f0" Apr 16 20:20:04.620584 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.620559 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq" Apr 16 20:20:04.620687 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.620584 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bn77bq" event={"ID":"cce797e7-a63a-4e5d-b701-29f038dc7783","Type":"ContainerDied","Data":"34acbacdbd54c5fffd15ba30eea680009e4ca83a37ad2e7e1a65c7256f3b3f01"} Apr 16 20:20:04.620687 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.620608 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34acbacdbd54c5fffd15ba30eea680009e4ca83a37ad2e7e1a65c7256f3b3f01" Apr 16 20:20:04.765588 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.765569 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr" Apr 16 20:20:04.768793 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.768775 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx" Apr 16 20:20:04.836581 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.836550 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-bundle\") pod \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " Apr 16 20:20:04.836581 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.836587 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-util\") pod \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " Apr 16 20:20:04.836804 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.836613 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f2jh\" (UniqueName: \"kubernetes.io/projected/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-kube-api-access-4f2jh\") pod \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " Apr 16 20:20:04.836804 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.836634 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-bundle\") pod \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " Apr 16 20:20:04.836804 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.836655 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-qphcr\" (UniqueName: \"kubernetes.io/projected/a3a46512-57bb-415b-b19d-6ae729dcd3b6-kube-api-access-qphcr\") pod \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\" (UID: \"a3a46512-57bb-415b-b19d-6ae729dcd3b6\") " Apr 16 20:20:04.836804 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.836690 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-util\") pod \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\" (UID: \"9fe049ee-02b5-4841-a8ea-22bb6211d9e1\") " Apr 16 20:20:04.837429 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.837368 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-bundle" (OuterVolumeSpecName: "bundle") pod "9fe049ee-02b5-4841-a8ea-22bb6211d9e1" (UID: "9fe049ee-02b5-4841-a8ea-22bb6211d9e1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:20:04.837538 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.837442 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-bundle" (OuterVolumeSpecName: "bundle") pod "a3a46512-57bb-415b-b19d-6ae729dcd3b6" (UID: "a3a46512-57bb-415b-b19d-6ae729dcd3b6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:20:04.838933 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.838909 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a46512-57bb-415b-b19d-6ae729dcd3b6-kube-api-access-qphcr" (OuterVolumeSpecName: "kube-api-access-qphcr") pod "a3a46512-57bb-415b-b19d-6ae729dcd3b6" (UID: "a3a46512-57bb-415b-b19d-6ae729dcd3b6"). InnerVolumeSpecName "kube-api-access-qphcr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:20:04.839122 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.839099 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-kube-api-access-4f2jh" (OuterVolumeSpecName: "kube-api-access-4f2jh") pod "9fe049ee-02b5-4841-a8ea-22bb6211d9e1" (UID: "9fe049ee-02b5-4841-a8ea-22bb6211d9e1"). InnerVolumeSpecName "kube-api-access-4f2jh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:20:04.842972 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.842927 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-util" (OuterVolumeSpecName: "util") pod "a3a46512-57bb-415b-b19d-6ae729dcd3b6" (UID: "a3a46512-57bb-415b-b19d-6ae729dcd3b6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:20:04.844515 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.844493 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-util" (OuterVolumeSpecName: "util") pod "9fe049ee-02b5-4841-a8ea-22bb6211d9e1" (UID: "9fe049ee-02b5-4841-a8ea-22bb6211d9e1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:20:04.937401 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.937374 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-util\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:04.937528 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.937404 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-bundle\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:04.937528 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.937419 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-util\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:04.937528 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.937434 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4f2jh\" (UniqueName: \"kubernetes.io/projected/9fe049ee-02b5-4841-a8ea-22bb6211d9e1-kube-api-access-4f2jh\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:04.937528 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.937451 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3a46512-57bb-415b-b19d-6ae729dcd3b6-bundle\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:04.937528 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:04.937467 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qphcr\" (UniqueName: \"kubernetes.io/projected/a3a46512-57bb-415b-b19d-6ae729dcd3b6-kube-api-access-qphcr\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:05.625824 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:05.625792 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx" event={"ID":"9fe049ee-02b5-4841-a8ea-22bb6211d9e1","Type":"ContainerDied","Data":"4d4f26bf9cadec60d1a9fb83d81cfc8b4590d422fb3fbfbd1d6e0df5664b1d14"} Apr 16 20:20:05.625824 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:05.625815 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503kz8fx" Apr 16 20:20:05.625824 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:05.625826 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d4f26bf9cadec60d1a9fb83d81cfc8b4590d422fb3fbfbd1d6e0df5664b1d14" Apr 16 20:20:05.627653 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:05.627629 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr" event={"ID":"a3a46512-57bb-415b-b19d-6ae729dcd3b6","Type":"ContainerDied","Data":"fe4d7665320e0094c205dcab90556e117ad47ba83e6f7e55051fa15e6651c747"} Apr 16 20:20:05.627653 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:05.627654 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe4d7665320e0094c205dcab90556e117ad47ba83e6f7e55051fa15e6651c747" Apr 16 20:20:05.627838 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:05.627680 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d7kkr" Apr 16 20:20:08.623393 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.623362 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b77b95f98-xqbnk"] Apr 16 20:20:08.623931 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.623870 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cce797e7-a63a-4e5d-b701-29f038dc7783" containerName="extract" Apr 16 20:20:08.623931 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.623890 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce797e7-a63a-4e5d-b701-29f038dc7783" containerName="extract" Apr 16 20:20:08.623931 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.623915 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8c670cf-d59c-47cd-9811-82fe2868cb03" containerName="extract" Apr 16 20:20:08.623931 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.623923 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c670cf-d59c-47cd-9811-82fe2868cb03" containerName="extract" Apr 16 20:20:08.623931 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.623931 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3a46512-57bb-415b-b19d-6ae729dcd3b6" containerName="pull" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.623940 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a46512-57bb-415b-b19d-6ae729dcd3b6" containerName="pull" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.623970 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cce797e7-a63a-4e5d-b701-29f038dc7783" containerName="util" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.623977 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce797e7-a63a-4e5d-b701-29f038dc7783" 
containerName="util" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.623988 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fe049ee-02b5-4841-a8ea-22bb6211d9e1" containerName="extract" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.623996 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe049ee-02b5-4841-a8ea-22bb6211d9e1" containerName="extract" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624007 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fe049ee-02b5-4841-a8ea-22bb6211d9e1" containerName="util" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624015 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe049ee-02b5-4841-a8ea-22bb6211d9e1" containerName="util" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624025 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cce797e7-a63a-4e5d-b701-29f038dc7783" containerName="pull" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624033 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce797e7-a63a-4e5d-b701-29f038dc7783" containerName="pull" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624054 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8c670cf-d59c-47cd-9811-82fe2868cb03" containerName="pull" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624062 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c670cf-d59c-47cd-9811-82fe2868cb03" containerName="pull" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624098 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fe049ee-02b5-4841-a8ea-22bb6211d9e1" containerName="pull" Apr 16 20:20:08.624225 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:20:08.624108 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe049ee-02b5-4841-a8ea-22bb6211d9e1" containerName="pull" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624121 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8c670cf-d59c-47cd-9811-82fe2868cb03" containerName="util" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624129 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c670cf-d59c-47cd-9811-82fe2868cb03" containerName="util" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624144 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3a46512-57bb-415b-b19d-6ae729dcd3b6" containerName="util" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624152 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a46512-57bb-415b-b19d-6ae729dcd3b6" containerName="util" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624160 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3a46512-57bb-415b-b19d-6ae729dcd3b6" containerName="extract" Apr 16 20:20:08.624225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624169 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a46512-57bb-415b-b19d-6ae729dcd3b6" containerName="extract" Apr 16 20:20:08.625255 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624253 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9fe049ee-02b5-4841-a8ea-22bb6211d9e1" containerName="extract" Apr 16 20:20:08.625255 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624266 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8c670cf-d59c-47cd-9811-82fe2868cb03" containerName="extract" Apr 16 20:20:08.625255 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624280 2577 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="a3a46512-57bb-415b-b19d-6ae729dcd3b6" containerName="extract" Apr 16 20:20:08.625255 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.624294 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="cce797e7-a63a-4e5d-b701-29f038dc7783" containerName="extract" Apr 16 20:20:08.627872 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.627846 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.638846 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.638816 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b77b95f98-xqbnk"] Apr 16 20:20:08.667175 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.667142 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9511771d-426a-40a5-9e10-90e1f3c18744-console-oauth-config\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.667345 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.667188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-console-config\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.667345 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.667268 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-service-ca\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.667345 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.667317 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q94xh\" (UniqueName: \"kubernetes.io/projected/9511771d-426a-40a5-9e10-90e1f3c18744-kube-api-access-q94xh\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.667513 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.667345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-oauth-serving-cert\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.667513 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.667386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-trusted-ca-bundle\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.667513 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.667456 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9511771d-426a-40a5-9e10-90e1f3c18744-console-serving-cert\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.768783 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.768744 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/9511771d-426a-40a5-9e10-90e1f3c18744-console-oauth-config\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.768783 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.768789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-console-config\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.769040 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.768828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-service-ca\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.769040 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.768870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q94xh\" (UniqueName: \"kubernetes.io/projected/9511771d-426a-40a5-9e10-90e1f3c18744-kube-api-access-q94xh\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.769040 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.768911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-oauth-serving-cert\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.769040 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.768938 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-trusted-ca-bundle\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.769040 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.769017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9511771d-426a-40a5-9e10-90e1f3c18744-console-serving-cert\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.769779 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.769751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-service-ca\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.769901 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.769777 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-oauth-serving-cert\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.769901 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.769889 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-trusted-ca-bundle\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.770070 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:20:08.769935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9511771d-426a-40a5-9e10-90e1f3c18744-console-config\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.771444 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.771421 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9511771d-426a-40a5-9e10-90e1f3c18744-console-oauth-config\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.771533 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.771443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9511771d-426a-40a5-9e10-90e1f3c18744-console-serving-cert\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.778496 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.778477 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q94xh\" (UniqueName: \"kubernetes.io/projected/9511771d-426a-40a5-9e10-90e1f3c18744-kube-api-access-q94xh\") pod \"console-7b77b95f98-xqbnk\" (UID: \"9511771d-426a-40a5-9e10-90e1f3c18744\") " pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:08.938697 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:08.938606 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:09.061706 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:09.061684 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b77b95f98-xqbnk"] Apr 16 20:20:09.064056 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:20:09.064026 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9511771d_426a_40a5_9e10_90e1f3c18744.slice/crio-032dc3783f75a0fc86d7d489ca5f784eea11495282ee5bbb6f2b6e5b09f6ae5c WatchSource:0}: Error finding container 032dc3783f75a0fc86d7d489ca5f784eea11495282ee5bbb6f2b6e5b09f6ae5c: Status 404 returned error can't find the container with id 032dc3783f75a0fc86d7d489ca5f784eea11495282ee5bbb6f2b6e5b09f6ae5c Apr 16 20:20:09.643930 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:09.643889 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b77b95f98-xqbnk" event={"ID":"9511771d-426a-40a5-9e10-90e1f3c18744","Type":"ContainerStarted","Data":"6906dcf73a35f5886fcede60e781affc5d7803ff110546c541e03ce4af828086"} Apr 16 20:20:09.644318 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:09.643937 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b77b95f98-xqbnk" event={"ID":"9511771d-426a-40a5-9e10-90e1f3c18744","Type":"ContainerStarted","Data":"032dc3783f75a0fc86d7d489ca5f784eea11495282ee5bbb6f2b6e5b09f6ae5c"} Apr 16 20:20:09.662845 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:09.662791 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b77b95f98-xqbnk" podStartSLOduration=1.662772501 podStartE2EDuration="1.662772501s" podCreationTimestamp="2026-04-16 20:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:20:09.661449777 +0000 UTC m=+509.337718013" 
watchObservedRunningTime="2026-04-16 20:20:09.662772501 +0000 UTC m=+509.339040734" Apr 16 20:20:18.912764 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:18.912730 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24"] Apr 16 20:20:18.915362 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:18.915347 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24" Apr 16 20:20:18.917696 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:18.917675 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 20:20:18.917696 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:18.917683 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-nzn5z\"" Apr 16 20:20:18.917864 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:18.917685 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 20:20:18.918405 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:18.918392 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 20:20:18.925978 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:18.925943 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24"] Apr 16 20:20:18.939455 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:18.939432 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:18.939559 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:18.939467 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:18.944422 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:18.944404 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:19.061814 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:19.061779 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d56xt\" (UniqueName: \"kubernetes.io/projected/bdaf6977-5831-4576-9c76-41b5f95830db-kube-api-access-d56xt\") pod \"dns-operator-controller-manager-844548ff4c-wzs24\" (UID: \"bdaf6977-5831-4576-9c76-41b5f95830db\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24" Apr 16 20:20:19.163348 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:19.163258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d56xt\" (UniqueName: \"kubernetes.io/projected/bdaf6977-5831-4576-9c76-41b5f95830db-kube-api-access-d56xt\") pod \"dns-operator-controller-manager-844548ff4c-wzs24\" (UID: \"bdaf6977-5831-4576-9c76-41b5f95830db\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24" Apr 16 20:20:19.174988 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:19.174931 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d56xt\" (UniqueName: \"kubernetes.io/projected/bdaf6977-5831-4576-9c76-41b5f95830db-kube-api-access-d56xt\") pod \"dns-operator-controller-manager-844548ff4c-wzs24\" (UID: \"bdaf6977-5831-4576-9c76-41b5f95830db\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24" Apr 16 20:20:19.227076 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:19.227036 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24" Apr 16 20:20:19.345578 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:19.345548 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24"] Apr 16 20:20:19.346919 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:20:19.346890 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdaf6977_5831_4576_9c76_41b5f95830db.slice/crio-67cc28d558268c203a325da6d14d22269ce6b3cb2326de2ed69ef9b32700b6c0 WatchSource:0}: Error finding container 67cc28d558268c203a325da6d14d22269ce6b3cb2326de2ed69ef9b32700b6c0: Status 404 returned error can't find the container with id 67cc28d558268c203a325da6d14d22269ce6b3cb2326de2ed69ef9b32700b6c0 Apr 16 20:20:19.679139 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:19.679108 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24" event={"ID":"bdaf6977-5831-4576-9c76-41b5f95830db","Type":"ContainerStarted","Data":"67cc28d558268c203a325da6d14d22269ce6b3cb2326de2ed69ef9b32700b6c0"} Apr 16 20:20:19.683225 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:19.683204 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b77b95f98-xqbnk" Apr 16 20:20:19.747023 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:19.746985 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d4f8855db-zvqt2"] Apr 16 20:20:22.693423 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:22.693374 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24" event={"ID":"bdaf6977-5831-4576-9c76-41b5f95830db","Type":"ContainerStarted","Data":"1192b014e419cda02204695b7e6d842071f80dd86d8cf0e70315ab5361cec9fc"} Apr 16 20:20:22.693792 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:22.693446 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24" Apr 16 20:20:22.712617 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:22.712573 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24" podStartSLOduration=1.7446997629999998 podStartE2EDuration="4.712559664s" podCreationTimestamp="2026-04-16 20:20:18 +0000 UTC" firstStartedPulling="2026-04-16 20:20:19.349027804 +0000 UTC m=+519.025296013" lastFinishedPulling="2026-04-16 20:20:22.316887696 +0000 UTC m=+521.993155914" observedRunningTime="2026-04-16 20:20:22.70986623 +0000 UTC m=+522.386134462" watchObservedRunningTime="2026-04-16 20:20:22.712559664 +0000 UTC m=+522.388827896" Apr 16 20:20:24.357964 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.357920 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg"] Apr 16 20:20:24.361301 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.361282 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:24.363578 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.363554 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 20:20:24.363673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.363561 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-2bn4s\"" Apr 16 20:20:24.363673 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.363561 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 20:20:24.368153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.368134 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg"] Apr 16 20:20:24.509674 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.509642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6570b1e5-6d2f-4da4-a328-7d2ae8157394-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-rlglg\" (UID: \"6570b1e5-6d2f-4da4-a328-7d2ae8157394\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:24.509817 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.509687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpkcz\" (UniqueName: \"kubernetes.io/projected/6570b1e5-6d2f-4da4-a328-7d2ae8157394-kube-api-access-jpkcz\") pod \"kuadrant-console-plugin-6c886788f8-rlglg\" (UID: \"6570b1e5-6d2f-4da4-a328-7d2ae8157394\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:24.509817 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.509799 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6570b1e5-6d2f-4da4-a328-7d2ae8157394-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-rlglg\" (UID: \"6570b1e5-6d2f-4da4-a328-7d2ae8157394\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:24.610614 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.610537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6570b1e5-6d2f-4da4-a328-7d2ae8157394-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-rlglg\" (UID: \"6570b1e5-6d2f-4da4-a328-7d2ae8157394\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:24.610614 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.610586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6570b1e5-6d2f-4da4-a328-7d2ae8157394-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-rlglg\" (UID: \"6570b1e5-6d2f-4da4-a328-7d2ae8157394\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:24.610828 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.610616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpkcz\" (UniqueName: \"kubernetes.io/projected/6570b1e5-6d2f-4da4-a328-7d2ae8157394-kube-api-access-jpkcz\") pod \"kuadrant-console-plugin-6c886788f8-rlglg\" (UID: \"6570b1e5-6d2f-4da4-a328-7d2ae8157394\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:24.610828 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:20:24.610739 2577 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 16 20:20:24.610828 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:20:24.610803 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6570b1e5-6d2f-4da4-a328-7d2ae8157394-plugin-serving-cert podName:6570b1e5-6d2f-4da4-a328-7d2ae8157394 nodeName:}" failed. No retries permitted until 2026-04-16 20:20:25.110786805 +0000 UTC m=+524.787055016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/6570b1e5-6d2f-4da4-a328-7d2ae8157394-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-rlglg" (UID: "6570b1e5-6d2f-4da4-a328-7d2ae8157394") : secret "plugin-serving-cert" not found Apr 16 20:20:24.611269 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.611248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6570b1e5-6d2f-4da4-a328-7d2ae8157394-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-rlglg\" (UID: \"6570b1e5-6d2f-4da4-a328-7d2ae8157394\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:24.619282 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:24.619263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpkcz\" (UniqueName: \"kubernetes.io/projected/6570b1e5-6d2f-4da4-a328-7d2ae8157394-kube-api-access-jpkcz\") pod \"kuadrant-console-plugin-6c886788f8-rlglg\" (UID: \"6570b1e5-6d2f-4da4-a328-7d2ae8157394\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:25.115370 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:25.115338 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6570b1e5-6d2f-4da4-a328-7d2ae8157394-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-rlglg\" (UID: \"6570b1e5-6d2f-4da4-a328-7d2ae8157394\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:25.117586 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:25.117567 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6570b1e5-6d2f-4da4-a328-7d2ae8157394-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-rlglg\" (UID: \"6570b1e5-6d2f-4da4-a328-7d2ae8157394\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:25.271487 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:25.271448 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" Apr 16 20:20:25.598286 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:25.598259 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg"] Apr 16 20:20:25.600013 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:20:25.599982 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6570b1e5_6d2f_4da4_a328_7d2ae8157394.slice/crio-31d2546842ab536ada09e38216c0625b9724bfb843b0a09f65f5a05414e07f0a WatchSource:0}: Error finding container 31d2546842ab536ada09e38216c0625b9724bfb843b0a09f65f5a05414e07f0a: Status 404 returned error can't find the container with id 31d2546842ab536ada09e38216c0625b9724bfb843b0a09f65f5a05414e07f0a Apr 16 20:20:25.705772 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:25.705737 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" event={"ID":"6570b1e5-6d2f-4da4-a328-7d2ae8157394","Type":"ContainerStarted","Data":"31d2546842ab536ada09e38216c0625b9724bfb843b0a09f65f5a05414e07f0a"} Apr 16 20:20:30.727097 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:30.727064 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" 
event={"ID":"6570b1e5-6d2f-4da4-a328-7d2ae8157394","Type":"ContainerStarted","Data":"f607df704f4493d4efc2388f864fdcfe6ee8080623f9deb0694d802c0514f616"} Apr 16 20:20:30.743168 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:30.743072 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rlglg" podStartSLOduration=1.860199277 podStartE2EDuration="6.743058416s" podCreationTimestamp="2026-04-16 20:20:24 +0000 UTC" firstStartedPulling="2026-04-16 20:20:25.601487326 +0000 UTC m=+525.277755544" lastFinishedPulling="2026-04-16 20:20:30.484346473 +0000 UTC m=+530.160614683" observedRunningTime="2026-04-16 20:20:30.741223513 +0000 UTC m=+530.417491749" watchObservedRunningTime="2026-04-16 20:20:30.743058416 +0000 UTC m=+530.419326650" Apr 16 20:20:33.700186 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:33.700153 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wzs24" Apr 16 20:20:44.769171 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:44.769127 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-d4f8855db-zvqt2" podUID="7b16828e-c4dc-448b-b30f-ca6119b260b7" containerName="console" containerID="cri-o://fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044" gracePeriod=15 Apr 16 20:20:45.017343 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.017309 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d4f8855db-zvqt2_7b16828e-c4dc-448b-b30f-ca6119b260b7/console/0.log" Apr 16 20:20:45.017447 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.017383 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:20:45.085039 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.085018 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-serving-cert\") pod \"7b16828e-c4dc-448b-b30f-ca6119b260b7\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " Apr 16 20:20:45.085202 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.085058 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-oauth-serving-cert\") pod \"7b16828e-c4dc-448b-b30f-ca6119b260b7\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " Apr 16 20:20:45.085202 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.085087 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-config\") pod \"7b16828e-c4dc-448b-b30f-ca6119b260b7\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " Apr 16 20:20:45.085202 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.085106 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnrnf\" (UniqueName: \"kubernetes.io/projected/7b16828e-c4dc-448b-b30f-ca6119b260b7-kube-api-access-vnrnf\") pod \"7b16828e-c4dc-448b-b30f-ca6119b260b7\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " Apr 16 20:20:45.085202 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.085133 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-oauth-config\") pod \"7b16828e-c4dc-448b-b30f-ca6119b260b7\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " Apr 16 20:20:45.085202 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.085174 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-service-ca\") pod \"7b16828e-c4dc-448b-b30f-ca6119b260b7\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " Apr 16 20:20:45.085464 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.085205 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-trusted-ca-bundle\") pod \"7b16828e-c4dc-448b-b30f-ca6119b260b7\" (UID: \"7b16828e-c4dc-448b-b30f-ca6119b260b7\") " Apr 16 20:20:45.085603 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.085511 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-config" (OuterVolumeSpecName: "console-config") pod "7b16828e-c4dc-448b-b30f-ca6119b260b7" (UID: "7b16828e-c4dc-448b-b30f-ca6119b260b7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:20:45.085603 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.085555 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7b16828e-c4dc-448b-b30f-ca6119b260b7" (UID: "7b16828e-c4dc-448b-b30f-ca6119b260b7"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:20:45.085783 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.085604 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-service-ca" (OuterVolumeSpecName: "service-ca") pod "7b16828e-c4dc-448b-b30f-ca6119b260b7" (UID: "7b16828e-c4dc-448b-b30f-ca6119b260b7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:20:45.085783 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.085726 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7b16828e-c4dc-448b-b30f-ca6119b260b7" (UID: "7b16828e-c4dc-448b-b30f-ca6119b260b7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:20:45.087429 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.087410 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7b16828e-c4dc-448b-b30f-ca6119b260b7" (UID: "7b16828e-c4dc-448b-b30f-ca6119b260b7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:20:45.087798 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.087780 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7b16828e-c4dc-448b-b30f-ca6119b260b7" (UID: "7b16828e-c4dc-448b-b30f-ca6119b260b7"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:20:45.087854 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.087804 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b16828e-c4dc-448b-b30f-ca6119b260b7-kube-api-access-vnrnf" (OuterVolumeSpecName: "kube-api-access-vnrnf") pod "7b16828e-c4dc-448b-b30f-ca6119b260b7" (UID: "7b16828e-c4dc-448b-b30f-ca6119b260b7"). InnerVolumeSpecName "kube-api-access-vnrnf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:20:45.186612 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.186567 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-oauth-config\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:45.186612 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.186606 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-service-ca\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:45.186612 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.186617 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-trusted-ca-bundle\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:45.186612 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.186626 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-serving-cert\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:45.186877 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.186635 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-oauth-serving-cert\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:45.186877 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.186645 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b16828e-c4dc-448b-b30f-ca6119b260b7-console-config\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:45.186877 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.186655 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vnrnf\" (UniqueName: \"kubernetes.io/projected/7b16828e-c4dc-448b-b30f-ca6119b260b7-kube-api-access-vnrnf\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:20:45.784662 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.784629 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d4f8855db-zvqt2_7b16828e-c4dc-448b-b30f-ca6119b260b7/console/0.log" Apr 16 20:20:45.785169 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.784668 2577 generic.go:358] "Generic (PLEG): container finished" podID="7b16828e-c4dc-448b-b30f-ca6119b260b7" containerID="fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044" exitCode=2 Apr 16 20:20:45.785169 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.784734 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d4f8855db-zvqt2" Apr 16 20:20:45.785169 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.784750 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4f8855db-zvqt2" event={"ID":"7b16828e-c4dc-448b-b30f-ca6119b260b7","Type":"ContainerDied","Data":"fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044"} Apr 16 20:20:45.785169 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.784788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4f8855db-zvqt2" event={"ID":"7b16828e-c4dc-448b-b30f-ca6119b260b7","Type":"ContainerDied","Data":"b71a2b3f1a9355fe9789604b6028d53420fc4abe36bc51a35f7fb979851d76d4"} Apr 16 20:20:45.785169 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.784804 2577 scope.go:117] "RemoveContainer" containerID="fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044" Apr 16 20:20:45.794924 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.794905 2577 scope.go:117] "RemoveContainer" containerID="fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044" Apr 16 20:20:45.795234 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:20:45.795214 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044\": container with ID starting with fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044 not found: ID does not exist" containerID="fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044" Apr 16 20:20:45.795288 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.795245 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044"} err="failed to get container status \"fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044\": rpc error: code = NotFound 
desc = could not find container \"fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044\": container with ID starting with fd65eea43f9a6a38420150e83cbb8059c2d5f96d4774f1ab48060c9c9a481044 not found: ID does not exist" Apr 16 20:20:45.808696 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.808672 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d4f8855db-zvqt2"] Apr 16 20:20:45.812516 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:45.812495 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d4f8855db-zvqt2"] Apr 16 20:20:46.875537 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:20:46.875501 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b16828e-c4dc-448b-b30f-ca6119b260b7" path="/var/lib/kubelet/pods/7b16828e-c4dc-448b-b30f-ca6119b260b7/volumes" Apr 16 20:21:05.414501 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.414460 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-h7fjx"] Apr 16 20:21:05.415129 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.415108 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b16828e-c4dc-448b-b30f-ca6119b260b7" containerName="console" Apr 16 20:21:05.415216 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.415131 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b16828e-c4dc-448b-b30f-ca6119b260b7" containerName="console" Apr 16 20:21:05.415270 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.415214 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b16828e-c4dc-448b-b30f-ca6119b260b7" containerName="console" Apr 16 20:21:05.431509 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.431448 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-h7fjx"] Apr 16 20:21:05.431671 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.431555 2577 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" Apr 16 20:21:05.434038 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.434012 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 20:21:05.443613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.443591 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-h7fjx"] Apr 16 20:21:05.561108 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.561060 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1908133b-0394-499d-ae09-9aaf7ea5ddfb-config-file\") pod \"limitador-limitador-67566c68b4-h7fjx\" (UID: \"1908133b-0394-499d-ae09-9aaf7ea5ddfb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" Apr 16 20:21:05.561291 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.561132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gchn\" (UniqueName: \"kubernetes.io/projected/1908133b-0394-499d-ae09-9aaf7ea5ddfb-kube-api-access-5gchn\") pod \"limitador-limitador-67566c68b4-h7fjx\" (UID: \"1908133b-0394-499d-ae09-9aaf7ea5ddfb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" Apr 16 20:21:05.662193 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.662146 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1908133b-0394-499d-ae09-9aaf7ea5ddfb-config-file\") pod \"limitador-limitador-67566c68b4-h7fjx\" (UID: \"1908133b-0394-499d-ae09-9aaf7ea5ddfb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" Apr 16 20:21:05.662368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.662215 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5gchn\" (UniqueName: \"kubernetes.io/projected/1908133b-0394-499d-ae09-9aaf7ea5ddfb-kube-api-access-5gchn\") pod \"limitador-limitador-67566c68b4-h7fjx\" (UID: \"1908133b-0394-499d-ae09-9aaf7ea5ddfb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" Apr 16 20:21:05.662762 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.662741 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1908133b-0394-499d-ae09-9aaf7ea5ddfb-config-file\") pod \"limitador-limitador-67566c68b4-h7fjx\" (UID: \"1908133b-0394-499d-ae09-9aaf7ea5ddfb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" Apr 16 20:21:05.669745 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.669686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gchn\" (UniqueName: \"kubernetes.io/projected/1908133b-0394-499d-ae09-9aaf7ea5ddfb-kube-api-access-5gchn\") pod \"limitador-limitador-67566c68b4-h7fjx\" (UID: \"1908133b-0394-499d-ae09-9aaf7ea5ddfb\") " pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" Apr 16 20:21:05.742602 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.742563 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" Apr 16 20:21:05.796749 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.796716 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-b6qd8"] Apr 16 20:21:05.800717 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.800694 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-b6qd8" Apr 16 20:21:05.803193 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.803162 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-68t9k\"" Apr 16 20:21:05.806172 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.805830 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-b6qd8"] Apr 16 20:21:05.864204 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.864164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gx8f\" (UniqueName: \"kubernetes.io/projected/515bee93-1d2b-4df9-822f-ba03f5c99872-kube-api-access-7gx8f\") pod \"authorino-674b59b84c-b6qd8\" (UID: \"515bee93-1d2b-4df9-822f-ba03f5c99872\") " pod="kuadrant-system/authorino-674b59b84c-b6qd8" Apr 16 20:21:05.889291 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.889268 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-h7fjx"] Apr 16 20:21:05.891622 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:21:05.891596 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1908133b_0394_499d_ae09_9aaf7ea5ddfb.slice/crio-b86b8f2ec25344106a3227bf802e09cd609d2d9bd56335d70ad759241d3dcce5 WatchSource:0}: Error finding container b86b8f2ec25344106a3227bf802e09cd609d2d9bd56335d70ad759241d3dcce5: Status 404 returned error can't find the container with id b86b8f2ec25344106a3227bf802e09cd609d2d9bd56335d70ad759241d3dcce5 Apr 16 20:21:05.950780 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.950708 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-lztl5"] Apr 16 20:21:05.953083 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.953068 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-lztl5" Apr 16 20:21:05.959986 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.959940 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-lztl5"] Apr 16 20:21:05.965575 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.965551 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gx8f\" (UniqueName: \"kubernetes.io/projected/515bee93-1d2b-4df9-822f-ba03f5c99872-kube-api-access-7gx8f\") pod \"authorino-674b59b84c-b6qd8\" (UID: \"515bee93-1d2b-4df9-822f-ba03f5c99872\") " pod="kuadrant-system/authorino-674b59b84c-b6qd8" Apr 16 20:21:05.975260 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:05.975236 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gx8f\" (UniqueName: \"kubernetes.io/projected/515bee93-1d2b-4df9-822f-ba03f5c99872-kube-api-access-7gx8f\") pod \"authorino-674b59b84c-b6qd8\" (UID: \"515bee93-1d2b-4df9-822f-ba03f5c99872\") " pod="kuadrant-system/authorino-674b59b84c-b6qd8" Apr 16 20:21:06.066704 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:06.066674 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w998b\" (UniqueName: \"kubernetes.io/projected/991095fe-bfc1-4624-a615-634d41441de7-kube-api-access-w998b\") pod \"authorino-79cbc94b89-lztl5\" (UID: \"991095fe-bfc1-4624-a615-634d41441de7\") " pod="kuadrant-system/authorino-79cbc94b89-lztl5" Apr 16 20:21:06.115604 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:06.115569 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-b6qd8" Apr 16 20:21:06.169035 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:06.168334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w998b\" (UniqueName: \"kubernetes.io/projected/991095fe-bfc1-4624-a615-634d41441de7-kube-api-access-w998b\") pod \"authorino-79cbc94b89-lztl5\" (UID: \"991095fe-bfc1-4624-a615-634d41441de7\") " pod="kuadrant-system/authorino-79cbc94b89-lztl5" Apr 16 20:21:06.180707 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:06.180673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w998b\" (UniqueName: \"kubernetes.io/projected/991095fe-bfc1-4624-a615-634d41441de7-kube-api-access-w998b\") pod \"authorino-79cbc94b89-lztl5\" (UID: \"991095fe-bfc1-4624-a615-634d41441de7\") " pod="kuadrant-system/authorino-79cbc94b89-lztl5" Apr 16 20:21:06.232590 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:06.232563 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-b6qd8"] Apr 16 20:21:06.234464 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:21:06.234435 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod515bee93_1d2b_4df9_822f_ba03f5c99872.slice/crio-1e6d8cdfeb2a91a7101d88b5995e00dfb287ecb4b29ead0985ea37b2fef9c9ff WatchSource:0}: Error finding container 1e6d8cdfeb2a91a7101d88b5995e00dfb287ecb4b29ead0985ea37b2fef9c9ff: Status 404 returned error can't find the container with id 1e6d8cdfeb2a91a7101d88b5995e00dfb287ecb4b29ead0985ea37b2fef9c9ff Apr 16 20:21:06.263377 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:06.263343 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-lztl5" Apr 16 20:21:06.379049 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:06.379022 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-lztl5"] Apr 16 20:21:06.380510 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:21:06.380486 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod991095fe_bfc1_4624_a615_634d41441de7.slice/crio-8ca075d1276b18b317fcb05be18afdc2a30106613ec1ba8a468f7479b3836936 WatchSource:0}: Error finding container 8ca075d1276b18b317fcb05be18afdc2a30106613ec1ba8a468f7479b3836936: Status 404 returned error can't find the container with id 8ca075d1276b18b317fcb05be18afdc2a30106613ec1ba8a468f7479b3836936 Apr 16 20:21:06.867497 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:06.867446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-b6qd8" event={"ID":"515bee93-1d2b-4df9-822f-ba03f5c99872","Type":"ContainerStarted","Data":"1e6d8cdfeb2a91a7101d88b5995e00dfb287ecb4b29ead0985ea37b2fef9c9ff"} Apr 16 20:21:06.868827 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:06.868790 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" event={"ID":"1908133b-0394-499d-ae09-9aaf7ea5ddfb","Type":"ContainerStarted","Data":"b86b8f2ec25344106a3227bf802e09cd609d2d9bd56335d70ad759241d3dcce5"} Apr 16 20:21:06.870201 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:06.870171 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-lztl5" event={"ID":"991095fe-bfc1-4624-a615-634d41441de7","Type":"ContainerStarted","Data":"8ca075d1276b18b317fcb05be18afdc2a30106613ec1ba8a468f7479b3836936"} Apr 16 20:21:08.884135 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:08.884098 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/authorino-674b59b84c-b6qd8" event={"ID":"515bee93-1d2b-4df9-822f-ba03f5c99872","Type":"ContainerStarted","Data":"e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610"} Apr 16 20:21:08.886841 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:08.886665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-lztl5" event={"ID":"991095fe-bfc1-4624-a615-634d41441de7","Type":"ContainerStarted","Data":"903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5"} Apr 16 20:21:08.899998 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:08.899078 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-b6qd8" podStartSLOduration=1.345578683 podStartE2EDuration="3.899055783s" podCreationTimestamp="2026-04-16 20:21:05 +0000 UTC" firstStartedPulling="2026-04-16 20:21:06.235748022 +0000 UTC m=+565.912016232" lastFinishedPulling="2026-04-16 20:21:08.789225122 +0000 UTC m=+568.465493332" observedRunningTime="2026-04-16 20:21:08.897826151 +0000 UTC m=+568.574094384" watchObservedRunningTime="2026-04-16 20:21:08.899055783 +0000 UTC m=+568.575324015" Apr 16 20:21:08.920715 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:08.920407 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-lztl5" podStartSLOduration=1.502941517 podStartE2EDuration="3.92038563s" podCreationTimestamp="2026-04-16 20:21:05 +0000 UTC" firstStartedPulling="2026-04-16 20:21:06.381821221 +0000 UTC m=+566.058089431" lastFinishedPulling="2026-04-16 20:21:08.799265334 +0000 UTC m=+568.475533544" observedRunningTime="2026-04-16 20:21:08.913861492 +0000 UTC m=+568.590129727" watchObservedRunningTime="2026-04-16 20:21:08.92038563 +0000 UTC m=+568.596653863" Apr 16 20:21:08.939969 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:08.939721 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/authorino-674b59b84c-b6qd8"] Apr 16 20:21:09.892176 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:09.892134 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" event={"ID":"1908133b-0394-499d-ae09-9aaf7ea5ddfb","Type":"ContainerStarted","Data":"60a8cfa5479f908761f9ed058e11952c2b4521b8ce6e3e9eed72b44a125701a3"} Apr 16 20:21:09.892590 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:09.892497 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" Apr 16 20:21:09.909201 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:09.909150 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" podStartSLOduration=1.519265941 podStartE2EDuration="4.909136945s" podCreationTimestamp="2026-04-16 20:21:05 +0000 UTC" firstStartedPulling="2026-04-16 20:21:05.893544426 +0000 UTC m=+565.569812649" lastFinishedPulling="2026-04-16 20:21:09.283415435 +0000 UTC m=+568.959683653" observedRunningTime="2026-04-16 20:21:09.906724157 +0000 UTC m=+569.582992393" watchObservedRunningTime="2026-04-16 20:21:09.909136945 +0000 UTC m=+569.585405177" Apr 16 20:21:10.895326 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:10.895287 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-b6qd8" podUID="515bee93-1d2b-4df9-822f-ba03f5c99872" containerName="authorino" containerID="cri-o://e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610" gracePeriod=30 Apr 16 20:21:11.123883 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.123860 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-b6qd8" Apr 16 20:21:11.213342 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.213256 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gx8f\" (UniqueName: \"kubernetes.io/projected/515bee93-1d2b-4df9-822f-ba03f5c99872-kube-api-access-7gx8f\") pod \"515bee93-1d2b-4df9-822f-ba03f5c99872\" (UID: \"515bee93-1d2b-4df9-822f-ba03f5c99872\") " Apr 16 20:21:11.215421 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.215400 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515bee93-1d2b-4df9-822f-ba03f5c99872-kube-api-access-7gx8f" (OuterVolumeSpecName: "kube-api-access-7gx8f") pod "515bee93-1d2b-4df9-822f-ba03f5c99872" (UID: "515bee93-1d2b-4df9-822f-ba03f5c99872"). InnerVolumeSpecName "kube-api-access-7gx8f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:21:11.314427 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.314399 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7gx8f\" (UniqueName: \"kubernetes.io/projected/515bee93-1d2b-4df9-822f-ba03f5c99872-kube-api-access-7gx8f\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:21:11.899661 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.899626 2577 generic.go:358] "Generic (PLEG): container finished" podID="515bee93-1d2b-4df9-822f-ba03f5c99872" containerID="e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610" exitCode=0 Apr 16 20:21:11.900065 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.899672 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-b6qd8" Apr 16 20:21:11.900065 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.899707 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-b6qd8" event={"ID":"515bee93-1d2b-4df9-822f-ba03f5c99872","Type":"ContainerDied","Data":"e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610"} Apr 16 20:21:11.900065 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.899739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-b6qd8" event={"ID":"515bee93-1d2b-4df9-822f-ba03f5c99872","Type":"ContainerDied","Data":"1e6d8cdfeb2a91a7101d88b5995e00dfb287ecb4b29ead0985ea37b2fef9c9ff"} Apr 16 20:21:11.900065 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.899758 2577 scope.go:117] "RemoveContainer" containerID="e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610" Apr 16 20:21:11.908507 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.908491 2577 scope.go:117] "RemoveContainer" containerID="e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610" Apr 16 20:21:11.908741 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:21:11.908719 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610\": container with ID starting with e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610 not found: ID does not exist" containerID="e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610" Apr 16 20:21:11.908788 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.908750 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610"} err="failed to get container status \"e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610\": rpc error: code = 
NotFound desc = could not find container \"e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610\": container with ID starting with e101e7508ffe6c8ad1815cae5d96fbcc7ef5b2dcc00e7c66c24d2ba2e15c8610 not found: ID does not exist" Apr 16 20:21:11.919013 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.918986 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-b6qd8"] Apr 16 20:21:11.922989 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:11.922970 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-b6qd8"] Apr 16 20:21:12.875680 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:12.875650 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515bee93-1d2b-4df9-822f-ba03f5c99872" path="/var/lib/kubelet/pods/515bee93-1d2b-4df9-822f-ba03f5c99872/volumes" Apr 16 20:21:20.896535 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:20.896508 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-h7fjx" Apr 16 20:21:30.006956 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.006906 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-xrzmd"] Apr 16 20:21:30.007325 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.007310 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="515bee93-1d2b-4df9-822f-ba03f5c99872" containerName="authorino" Apr 16 20:21:30.007377 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.007327 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="515bee93-1d2b-4df9-822f-ba03f5c99872" containerName="authorino" Apr 16 20:21:30.007411 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.007396 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="515bee93-1d2b-4df9-822f-ba03f5c99872" containerName="authorino" Apr 16 20:21:30.010194 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:21:30.010179 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-xrzmd" Apr 16 20:21:30.012330 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.012307 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 20:21:30.016010 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.015940 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-xrzmd"] Apr 16 20:21:30.176215 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.176184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmmvw\" (UniqueName: \"kubernetes.io/projected/aad07ab7-8103-4561-997c-1467c2b60d4a-kube-api-access-xmmvw\") pod \"authorino-68bd676465-xrzmd\" (UID: \"aad07ab7-8103-4561-997c-1467c2b60d4a\") " pod="kuadrant-system/authorino-68bd676465-xrzmd" Apr 16 20:21:30.176360 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.176277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/aad07ab7-8103-4561-997c-1467c2b60d4a-tls-cert\") pod \"authorino-68bd676465-xrzmd\" (UID: \"aad07ab7-8103-4561-997c-1467c2b60d4a\") " pod="kuadrant-system/authorino-68bd676465-xrzmd" Apr 16 20:21:30.277702 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.277610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmmvw\" (UniqueName: \"kubernetes.io/projected/aad07ab7-8103-4561-997c-1467c2b60d4a-kube-api-access-xmmvw\") pod \"authorino-68bd676465-xrzmd\" (UID: \"aad07ab7-8103-4561-997c-1467c2b60d4a\") " pod="kuadrant-system/authorino-68bd676465-xrzmd" Apr 16 20:21:30.277702 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.277691 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/aad07ab7-8103-4561-997c-1467c2b60d4a-tls-cert\") pod \"authorino-68bd676465-xrzmd\" (UID: \"aad07ab7-8103-4561-997c-1467c2b60d4a\") " pod="kuadrant-system/authorino-68bd676465-xrzmd" Apr 16 20:21:30.280098 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.280073 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/aad07ab7-8103-4561-997c-1467c2b60d4a-tls-cert\") pod \"authorino-68bd676465-xrzmd\" (UID: \"aad07ab7-8103-4561-997c-1467c2b60d4a\") " pod="kuadrant-system/authorino-68bd676465-xrzmd" Apr 16 20:21:30.284861 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.284837 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmmvw\" (UniqueName: \"kubernetes.io/projected/aad07ab7-8103-4561-997c-1467c2b60d4a-kube-api-access-xmmvw\") pod \"authorino-68bd676465-xrzmd\" (UID: \"aad07ab7-8103-4561-997c-1467c2b60d4a\") " pod="kuadrant-system/authorino-68bd676465-xrzmd" Apr 16 20:21:30.320079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.320053 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-xrzmd" Apr 16 20:21:30.439301 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.439274 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-xrzmd"] Apr 16 20:21:30.440763 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:21:30.440736 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaad07ab7_8103_4561_997c_1467c2b60d4a.slice/crio-993a3e4e1fd3dd9432bcaebc31aa214118d069d7e4f77af28b03c2dcd55513b5 WatchSource:0}: Error finding container 993a3e4e1fd3dd9432bcaebc31aa214118d069d7e4f77af28b03c2dcd55513b5: Status 404 returned error can't find the container with id 993a3e4e1fd3dd9432bcaebc31aa214118d069d7e4f77af28b03c2dcd55513b5 Apr 16 20:21:30.976766 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:30.976740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-xrzmd" event={"ID":"aad07ab7-8103-4561-997c-1467c2b60d4a","Type":"ContainerStarted","Data":"993a3e4e1fd3dd9432bcaebc31aa214118d069d7e4f77af28b03c2dcd55513b5"} Apr 16 20:21:31.982306 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:31.982271 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-xrzmd" event={"ID":"aad07ab7-8103-4561-997c-1467c2b60d4a","Type":"ContainerStarted","Data":"f1a31651d0667f47024369aa924d0bb34901230c0cb3c0161c55c4c240c9b72f"} Apr 16 20:21:31.996823 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:31.996775 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-xrzmd" podStartSLOduration=2.494549508 podStartE2EDuration="2.996762211s" podCreationTimestamp="2026-04-16 20:21:29 +0000 UTC" firstStartedPulling="2026-04-16 20:21:30.442127684 +0000 UTC m=+590.118395897" lastFinishedPulling="2026-04-16 20:21:30.944340386 +0000 UTC m=+590.620608600" 
observedRunningTime="2026-04-16 20:21:31.995558643 +0000 UTC m=+591.671826866" watchObservedRunningTime="2026-04-16 20:21:31.996762211 +0000 UTC m=+591.673030442" Apr 16 20:21:32.021605 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.021573 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-lztl5"] Apr 16 20:21:32.021819 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.021783 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-lztl5" podUID="991095fe-bfc1-4624-a615-634d41441de7" containerName="authorino" containerID="cri-o://903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5" gracePeriod=30 Apr 16 20:21:32.267966 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.267923 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-lztl5" Apr 16 20:21:32.297924 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.297888 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w998b\" (UniqueName: \"kubernetes.io/projected/991095fe-bfc1-4624-a615-634d41441de7-kube-api-access-w998b\") pod \"991095fe-bfc1-4624-a615-634d41441de7\" (UID: \"991095fe-bfc1-4624-a615-634d41441de7\") " Apr 16 20:21:32.299926 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.299900 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991095fe-bfc1-4624-a615-634d41441de7-kube-api-access-w998b" (OuterVolumeSpecName: "kube-api-access-w998b") pod "991095fe-bfc1-4624-a615-634d41441de7" (UID: "991095fe-bfc1-4624-a615-634d41441de7"). InnerVolumeSpecName "kube-api-access-w998b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:21:32.399203 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.399172 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w998b\" (UniqueName: \"kubernetes.io/projected/991095fe-bfc1-4624-a615-634d41441de7-kube-api-access-w998b\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:21:32.987799 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.987764 2577 generic.go:358] "Generic (PLEG): container finished" podID="991095fe-bfc1-4624-a615-634d41441de7" containerID="903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5" exitCode=0 Apr 16 20:21:32.988223 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.987813 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-lztl5" Apr 16 20:21:32.988223 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.987852 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-lztl5" event={"ID":"991095fe-bfc1-4624-a615-634d41441de7","Type":"ContainerDied","Data":"903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5"} Apr 16 20:21:32.988223 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.987894 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-lztl5" event={"ID":"991095fe-bfc1-4624-a615-634d41441de7","Type":"ContainerDied","Data":"8ca075d1276b18b317fcb05be18afdc2a30106613ec1ba8a468f7479b3836936"} Apr 16 20:21:32.988223 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.987917 2577 scope.go:117] "RemoveContainer" containerID="903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5" Apr 16 20:21:32.996470 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.996451 2577 scope.go:117] "RemoveContainer" containerID="903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5" Apr 16 20:21:32.996729 ip-10-0-131-31 kubenswrapper[2577]: E0416 
20:21:32.996711 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5\": container with ID starting with 903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5 not found: ID does not exist" containerID="903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5" Apr 16 20:21:32.996775 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:32.996740 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5"} err="failed to get container status \"903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5\": rpc error: code = NotFound desc = could not find container \"903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5\": container with ID starting with 903d40b2e4f0e9b8e0268c34d4d391ad750a7bc6346b03d0f7dc63ce99db7cf5 not found: ID does not exist" Apr 16 20:21:33.004112 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:33.004087 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-lztl5"] Apr 16 20:21:33.007832 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:33.007810 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-lztl5"] Apr 16 20:21:34.875325 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:34.875295 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="991095fe-bfc1-4624-a615-634d41441de7" path="/var/lib/kubelet/pods/991095fe-bfc1-4624-a615-634d41441de7/volumes" Apr 16 20:21:40.794325 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:21:40.794304 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:21:40.796245 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:21:40.796228 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:23:44.895737 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:44.895706 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs"] Apr 16 20:23:44.896139 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:44.896088 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="991095fe-bfc1-4624-a615-634d41441de7" containerName="authorino" Apr 16 20:23:44.896139 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:44.896101 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="991095fe-bfc1-4624-a615-634d41441de7" containerName="authorino" Apr 16 20:23:44.896692 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:44.896669 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="991095fe-bfc1-4624-a615-634d41441de7" containerName="authorino" Apr 16 20:23:44.901263 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:44.901236 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:44.905123 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:44.905091 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\"" Apr 16 20:23:44.905251 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:44.905131 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 16 20:23:44.905251 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:44.905139 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:23:44.905251 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:44.905187 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:23:44.913153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:44.913132 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs"] Apr 16 20:23:45.007866 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.007837 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.008060 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.007876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.008060 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.007900 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.008060 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.007992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.008060 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.008022 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjhgq\" (UniqueName: \"kubernetes.io/projected/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kube-api-access-pjhgq\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.008271 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.008149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.109610 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.109556 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.109808 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.109627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.109808 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.109660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.109808 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.109690 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.109808 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.109725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.109808 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.109754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjhgq\" (UniqueName: \"kubernetes.io/projected/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kube-api-access-pjhgq\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.110118 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.110092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.110171 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.110113 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.110171 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.110133 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.111807 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.111783 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.112080 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.112064 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.121606 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.121585 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjhgq\" (UniqueName: \"kubernetes.io/projected/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kube-api-access-pjhgq\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.213667 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.213591 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:23:45.352153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.352129 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs"] Apr 16 20:23:45.353645 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:23:45.353617 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57cd34ea_959f_4d9f_8bbe_ea9e81588a9a.slice/crio-960cebceec0990cb7d4b5744b525a8e55547c931543e2d48d58acc724cc83703 WatchSource:0}: Error finding container 960cebceec0990cb7d4b5744b525a8e55547c931543e2d48d58acc724cc83703: Status 404 returned error can't find the container with id 960cebceec0990cb7d4b5744b525a8e55547c931543e2d48d58acc724cc83703 Apr 16 20:23:45.355394 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.355378 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:23:45.490814 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:45.490710 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" event={"ID":"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a","Type":"ContainerStarted","Data":"960cebceec0990cb7d4b5744b525a8e55547c931543e2d48d58acc724cc83703"} Apr 16 20:23:49.510435 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:49.510393 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" event={"ID":"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a","Type":"ContainerStarted","Data":"37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff"} Apr 16 20:23:53.529162 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:53.529123 2577 generic.go:358] "Generic (PLEG): container finished" podID="57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" containerID="37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff" exitCode=0 Apr 16 20:23:53.529624 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:53.529167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" event={"ID":"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a","Type":"ContainerDied","Data":"37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff"} Apr 16 20:23:55.537281 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:55.537247 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" event={"ID":"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a","Type":"ContainerStarted","Data":"c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a"} Apr 16 20:23:55.555402 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:23:55.555334 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" podStartSLOduration=2.386507734 podStartE2EDuration="11.555317412s" podCreationTimestamp="2026-04-16 20:23:44 +0000 UTC" firstStartedPulling="2026-04-16 20:23:45.355501267 +0000 UTC m=+725.031769477" lastFinishedPulling="2026-04-16 20:23:54.524310938 +0000 UTC m=+734.200579155" observedRunningTime="2026-04-16 20:23:55.555226463 +0000 UTC m=+735.231494696" watchObservedRunningTime="2026-04-16 20:23:55.555317412 +0000 UTC m=+735.231585646" Apr 16 20:24:05.214752 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:24:05.214721 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:24:05.214752 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:24:05.214757 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:24:05.227435 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:24:05.227408 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:24:05.583648 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:24:05.583627 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:25:06.082205 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.082136 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs"] Apr 16 20:25:06.082679 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.082395 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" podUID="57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" containerName="main" containerID="cri-o://c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a" gracePeriod=30 Apr 16 20:25:06.326885 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.326866 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:25:06.418055 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.417982 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjhgq\" (UniqueName: \"kubernetes.io/projected/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kube-api-access-pjhgq\") pod \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " Apr 16 20:25:06.418055 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.418036 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-dshm\") pod \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " Apr 16 20:25:06.418228 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.418109 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-model-cache\") pod \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " Apr 16 20:25:06.418228 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.418142 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-home\") pod \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " Apr 16 20:25:06.418228 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.418170 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kserve-provision-location\") pod \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " Apr 16 20:25:06.418228 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.418197 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-tls-certs\") pod \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\" (UID: \"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a\") " Apr 16 20:25:06.418438 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.418412 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-home" (OuterVolumeSpecName: "home") pod "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" (UID: "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:06.418438 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.418410 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-model-cache" (OuterVolumeSpecName: "model-cache") pod "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" (UID: "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:06.420298 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.420265 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" (UID: "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:25:06.420411 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.420317 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kube-api-access-pjhgq" (OuterVolumeSpecName: "kube-api-access-pjhgq") pod "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" (UID: "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a"). InnerVolumeSpecName "kube-api-access-pjhgq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:25:06.420411 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.420344 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-dshm" (OuterVolumeSpecName: "dshm") pod "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" (UID: "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:06.472303 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.472263 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" (UID: "57cd34ea-959f-4d9f-8bbe-ea9e81588a9a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:06.523435 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.519231 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-dshm\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:25:06.523435 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.519266 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-model-cache\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:25:06.523435 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.519281 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-home\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:25:06.523435 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.519297 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kserve-provision-location\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:25:06.523435 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.519317 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-tls-certs\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:25:06.523435 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.519332 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pjhgq\" (UniqueName: \"kubernetes.io/projected/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a-kube-api-access-pjhgq\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:25:06.802781 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.802696 2577 generic.go:358] 
"Generic (PLEG): container finished" podID="57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" containerID="c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a" exitCode=0 Apr 16 20:25:06.802781 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.802759 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" event={"ID":"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a","Type":"ContainerDied","Data":"c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a"} Apr 16 20:25:06.802988 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.802785 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" Apr 16 20:25:06.802988 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.802792 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs" event={"ID":"57cd34ea-959f-4d9f-8bbe-ea9e81588a9a","Type":"ContainerDied","Data":"960cebceec0990cb7d4b5744b525a8e55547c931543e2d48d58acc724cc83703"} Apr 16 20:25:06.802988 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.802810 2577 scope.go:117] "RemoveContainer" containerID="c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a" Apr 16 20:25:06.812852 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.812835 2577 scope.go:117] "RemoveContainer" containerID="37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff" Apr 16 20:25:06.822878 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.822859 2577 scope.go:117] "RemoveContainer" containerID="c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a" Apr 16 20:25:06.823256 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:25:06.823230 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a\": container with ID starting with c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a not found: ID does not exist" containerID="c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a" Apr 16 20:25:06.823256 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.823267 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a"} err="failed to get container status \"c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a\": rpc error: code = NotFound desc = could not find container \"c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a\": container with ID starting with c6fe0bfa3e5362ddeaaf0cba73a4602c233ad5027393fc25b0f79cb759bc8d4a not found: ID does not exist" Apr 16 20:25:06.823410 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.823293 2577 scope.go:117] "RemoveContainer" containerID="37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff" Apr 16 20:25:06.823579 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:25:06.823559 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff\": container with ID starting with 37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff not found: ID does not exist" containerID="37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff" Apr 16 20:25:06.823650 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.823587 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff"} err="failed to get container status \"37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff\": rpc error: code = NotFound desc = could not find container 
\"37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff\": container with ID starting with 37b772ee13b2ec51227ffb38b5870c504a633f2670432cebe4872e88a444d6ff not found: ID does not exist" Apr 16 20:25:06.825491 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.825473 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs"] Apr 16 20:25:06.829243 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.829225 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-56ff6b644b94gzs"] Apr 16 20:25:06.875697 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:06.875669 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" path="/var/lib/kubelet/pods/57cd34ea-959f-4d9f-8bbe-ea9e81588a9a/volumes" Apr 16 20:25:22.393469 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.393433 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th"] Apr 16 20:25:22.394033 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.394016 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" containerName="storage-initializer" Apr 16 20:25:22.394086 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.394036 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" containerName="storage-initializer" Apr 16 20:25:22.394086 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.394056 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" containerName="main" Apr 16 20:25:22.394086 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.394065 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" 
containerName="main" Apr 16 20:25:22.394179 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.394162 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="57cd34ea-959f-4d9f-8bbe-ea9e81588a9a" containerName="main" Apr 16 20:25:22.399710 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.399690 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.402651 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.402627 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\"" Apr 16 20:25:22.402802 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.402664 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:25:22.402997 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.402707 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 20:25:22.403150 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.403127 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:25:22.405270 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.405246 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th"] Apr 16 20:25:22.459200 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.459169 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.459200 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.459202 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.459374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.459218 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9x86\" (UniqueName: \"kubernetes.io/projected/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kube-api-access-n9x86\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.459374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.459278 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.459374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.459318 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: 
\"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.459374 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.459344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.560467 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.560424 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.560467 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.560470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.560674 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.560488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9x86\" (UniqueName: \"kubernetes.io/projected/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kube-api-access-n9x86\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.560674 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.560508 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.560674 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.560540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.560674 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.560562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.560928 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.560904 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.561026 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.560918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.561026 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.560990 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.562840 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.562816 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.563022 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.563003 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 
20:25:22.568486 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.568465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9x86\" (UniqueName: \"kubernetes.io/projected/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kube-api-access-n9x86\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.711279 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.711196 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:25:22.834368 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.834174 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th"] Apr 16 20:25:22.836717 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:25:22.836685 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88b37c33_7f6d_44b4_9915_bbcd9c15ef62.slice/crio-2244d9d3f3b0362fae83ba0abeaf0279f88db608573a6cb581034817518480a3 WatchSource:0}: Error finding container 2244d9d3f3b0362fae83ba0abeaf0279f88db608573a6cb581034817518480a3: Status 404 returned error can't find the container with id 2244d9d3f3b0362fae83ba0abeaf0279f88db608573a6cb581034817518480a3 Apr 16 20:25:22.863000 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:22.862970 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" event={"ID":"88b37c33-7f6d-44b4-9915-bbcd9c15ef62","Type":"ContainerStarted","Data":"2244d9d3f3b0362fae83ba0abeaf0279f88db608573a6cb581034817518480a3"} Apr 16 20:25:23.870835 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:23.870742 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" event={"ID":"88b37c33-7f6d-44b4-9915-bbcd9c15ef62","Type":"ContainerStarted","Data":"51b907463e7052641077ba2f65db30932e3985c47b217b00d5f0e69acc5ef8cd"} Apr 16 20:25:27.887138 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:27.887100 2577 generic.go:358] "Generic (PLEG): container finished" podID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerID="51b907463e7052641077ba2f65db30932e3985c47b217b00d5f0e69acc5ef8cd" exitCode=0 Apr 16 20:25:27.887479 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:27.887170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" event={"ID":"88b37c33-7f6d-44b4-9915-bbcd9c15ef62","Type":"ContainerDied","Data":"51b907463e7052641077ba2f65db30932e3985c47b217b00d5f0e69acc5ef8cd"} Apr 16 20:25:35.821691 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:35.821659 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b"] Apr 16 20:25:35.932882 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:35.932851 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b"] Apr 16 20:25:35.933095 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:35.933012 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:35.935970 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:35.935925 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 20:25:36.090195 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.090148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-model-cache\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.090419 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.090194 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e409eca-25b4-4ad1-8be2-7ee858119bdc-tls-certs\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.090419 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.090227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.090419 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.090321 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-home\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.090419 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.090379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrdhb\" (UniqueName: \"kubernetes.io/projected/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kube-api-access-hrdhb\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.090716 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.090442 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-dshm\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.192687 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.192024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-home\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.192687 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.192097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrdhb\" (UniqueName: \"kubernetes.io/projected/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kube-api-access-hrdhb\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: 
\"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.192687 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.192158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-dshm\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.192687 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.192219 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-model-cache\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.192687 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.192244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e409eca-25b4-4ad1-8be2-7ee858119bdc-tls-certs\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.192687 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.192269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.192687 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:25:36.192452 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-home\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.193226 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.192793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-model-cache\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.193226 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.192806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.195169 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.195140 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-dshm\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.195447 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.195403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e409eca-25b4-4ad1-8be2-7ee858119bdc-tls-certs\") pod 
\"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.200474 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.200442 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrdhb\" (UniqueName: \"kubernetes.io/projected/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kube-api-access-hrdhb\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-66t9b\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:36.247147 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:36.247101 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:40.245721 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:40.245689 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b"] Apr 16 20:25:40.247744 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:25:40.247715 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e409eca_25b4_4ad1_8be2_7ee858119bdc.slice/crio-ac13a2cd441820d2edfb82cf3a2685586746f01ed2cb441ceaea8b262cdbf286 WatchSource:0}: Error finding container ac13a2cd441820d2edfb82cf3a2685586746f01ed2cb441ceaea8b262cdbf286: Status 404 returned error can't find the container with id ac13a2cd441820d2edfb82cf3a2685586746f01ed2cb441ceaea8b262cdbf286 Apr 16 20:25:40.950061 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:40.950018 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" 
event={"ID":"6e409eca-25b4-4ad1-8be2-7ee858119bdc","Type":"ContainerStarted","Data":"3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9"} Apr 16 20:25:40.950061 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:40.950062 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" event={"ID":"6e409eca-25b4-4ad1-8be2-7ee858119bdc","Type":"ContainerStarted","Data":"ac13a2cd441820d2edfb82cf3a2685586746f01ed2cb441ceaea8b262cdbf286"} Apr 16 20:25:54.010640 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:54.010544 2577 generic.go:358] "Generic (PLEG): container finished" podID="6e409eca-25b4-4ad1-8be2-7ee858119bdc" containerID="3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9" exitCode=0 Apr 16 20:25:54.010640 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:54.010614 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" event={"ID":"6e409eca-25b4-4ad1-8be2-7ee858119bdc","Type":"ContainerDied","Data":"3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9"} Apr 16 20:25:56.019997 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:56.019878 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" event={"ID":"88b37c33-7f6d-44b4-9915-bbcd9c15ef62","Type":"ContainerStarted","Data":"bf32f33f9d8b7963270282d500270a5f18b057645e899016ccc7e28c1fbab4b6"} Apr 16 20:25:56.021636 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:56.021610 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" event={"ID":"6e409eca-25b4-4ad1-8be2-7ee858119bdc","Type":"ContainerStarted","Data":"7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7"} Apr 16 20:25:56.041494 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:56.041439 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" podStartSLOduration=6.241947141 podStartE2EDuration="34.041422683s" podCreationTimestamp="2026-04-16 20:25:22 +0000 UTC" firstStartedPulling="2026-04-16 20:25:27.8882814 +0000 UTC m=+827.564549613" lastFinishedPulling="2026-04-16 20:25:55.687756945 +0000 UTC m=+855.364025155" observedRunningTime="2026-04-16 20:25:56.03838044 +0000 UTC m=+855.714648673" watchObservedRunningTime="2026-04-16 20:25:56.041422683 +0000 UTC m=+855.717690938" Apr 16 20:25:56.056328 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:56.056241 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" podStartSLOduration=21.056219219 podStartE2EDuration="21.056219219s" podCreationTimestamp="2026-04-16 20:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:25:56.054902868 +0000 UTC m=+855.731171096" watchObservedRunningTime="2026-04-16 20:25:56.056219219 +0000 UTC m=+855.732487451" Apr 16 20:25:56.247807 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:56.247774 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:56.247807 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:56.247810 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:56.260161 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:56.260139 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:25:57.036322 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:25:57.036295 2577 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:26:02.711711 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:02.711664 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:26:02.711711 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:02.711717 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:26:02.713360 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:02.713303 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 16 20:26:10.092594 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:10.092562 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b"] Apr 16 20:26:10.093039 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:10.092832 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" podUID="6e409eca-25b4-4ad1-8be2-7ee858119bdc" containerName="main" containerID="cri-o://7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7" gracePeriod=30 Apr 16 20:26:10.896476 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:10.896455 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:26:11.015699 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.015617 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e409eca-25b4-4ad1-8be2-7ee858119bdc-tls-certs\") pod \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " Apr 16 20:26:11.015699 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.015661 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-dshm\") pod \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " Apr 16 20:26:11.015935 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.015704 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrdhb\" (UniqueName: \"kubernetes.io/projected/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kube-api-access-hrdhb\") pod \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " Apr 16 20:26:11.015935 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.015785 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-model-cache\") pod \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " Apr 16 20:26:11.015935 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.015816 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-home\") pod \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " Apr 16 20:26:11.015935 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.015854 
2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kserve-provision-location\") pod \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\" (UID: \"6e409eca-25b4-4ad1-8be2-7ee858119bdc\") " Apr 16 20:26:11.016155 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.016134 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-home" (OuterVolumeSpecName: "home") pod "6e409eca-25b4-4ad1-8be2-7ee858119bdc" (UID: "6e409eca-25b4-4ad1-8be2-7ee858119bdc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:11.016206 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.016140 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-model-cache" (OuterVolumeSpecName: "model-cache") pod "6e409eca-25b4-4ad1-8be2-7ee858119bdc" (UID: "6e409eca-25b4-4ad1-8be2-7ee858119bdc"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:11.016467 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.016435 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-model-cache\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:26:11.016671 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.016649 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-home\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:26:11.018300 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.018276 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e409eca-25b4-4ad1-8be2-7ee858119bdc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6e409eca-25b4-4ad1-8be2-7ee858119bdc" (UID: "6e409eca-25b4-4ad1-8be2-7ee858119bdc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:26:11.018300 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.018290 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-dshm" (OuterVolumeSpecName: "dshm") pod "6e409eca-25b4-4ad1-8be2-7ee858119bdc" (UID: "6e409eca-25b4-4ad1-8be2-7ee858119bdc"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:11.018463 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.018443 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kube-api-access-hrdhb" (OuterVolumeSpecName: "kube-api-access-hrdhb") pod "6e409eca-25b4-4ad1-8be2-7ee858119bdc" (UID: "6e409eca-25b4-4ad1-8be2-7ee858119bdc"). InnerVolumeSpecName "kube-api-access-hrdhb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:26:11.070964 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.070915 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6e409eca-25b4-4ad1-8be2-7ee858119bdc" (UID: "6e409eca-25b4-4ad1-8be2-7ee858119bdc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:11.082876 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.082842 2577 generic.go:358] "Generic (PLEG): container finished" podID="6e409eca-25b4-4ad1-8be2-7ee858119bdc" containerID="7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7" exitCode=0 Apr 16 20:26:11.083043 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.082900 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" event={"ID":"6e409eca-25b4-4ad1-8be2-7ee858119bdc","Type":"ContainerDied","Data":"7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7"} Apr 16 20:26:11.083043 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.082938 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" event={"ID":"6e409eca-25b4-4ad1-8be2-7ee858119bdc","Type":"ContainerDied","Data":"ac13a2cd441820d2edfb82cf3a2685586746f01ed2cb441ceaea8b262cdbf286"} Apr 16 20:26:11.083043 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.082977 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b" Apr 16 20:26:11.083168 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.082984 2577 scope.go:117] "RemoveContainer" containerID="7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7" Apr 16 20:26:11.092067 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.092048 2577 scope.go:117] "RemoveContainer" containerID="3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9" Apr 16 20:26:11.104991 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.104827 2577 scope.go:117] "RemoveContainer" containerID="7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7" Apr 16 20:26:11.105238 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:26:11.105126 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7\": container with ID starting with 7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7 not found: ID does not exist" containerID="7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7" Apr 16 20:26:11.105238 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.105152 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7"} err="failed to get container status \"7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7\": rpc error: code = NotFound desc = could not find container \"7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7\": container with ID starting with 7cda5d6194b47dd73584c67693e77ab4d0e8575ca52e19b839f7661646abc6d7 not found: ID does not exist" Apr 16 20:26:11.105238 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.105169 2577 scope.go:117] "RemoveContainer" containerID="3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9" Apr 16 
20:26:11.105400 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:26:11.105379 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9\": container with ID starting with 3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9 not found: ID does not exist" containerID="3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9" Apr 16 20:26:11.105445 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.105409 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9"} err="failed to get container status \"3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9\": rpc error: code = NotFound desc = could not find container \"3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9\": container with ID starting with 3d810153aa4886022fe616e956eb1924cb0a11e03c05e8d069c6b2b60d1328e9 not found: ID does not exist" Apr 16 20:26:11.110381 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.110355 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b"] Apr 16 20:26:11.113818 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.113793 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-66t9b"] Apr 16 20:26:11.117618 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.117596 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kserve-provision-location\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:26:11.117680 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.117625 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6e409eca-25b4-4ad1-8be2-7ee858119bdc-tls-certs\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:26:11.117680 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.117644 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e409eca-25b4-4ad1-8be2-7ee858119bdc-dshm\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:26:11.117680 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:11.117657 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hrdhb\" (UniqueName: \"kubernetes.io/projected/6e409eca-25b4-4ad1-8be2-7ee858119bdc-kube-api-access-hrdhb\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:26:12.711909 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:12.711870 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 16 20:26:12.876187 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:12.876155 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e409eca-25b4-4ad1-8be2-7ee858119bdc" path="/var/lib/kubelet/pods/6e409eca-25b4-4ad1-8be2-7ee858119bdc/volumes" Apr 16 20:26:22.712240 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:22.712199 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 16 20:26:32.711686 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:32.711566 2577 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 16 20:26:40.829872 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:40.829838 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:26:40.830433 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:40.830093 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:26:42.712260 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:42.712212 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 16 20:26:44.120135 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.120094 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5"] Apr 16 20:26:44.120747 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.120725 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e409eca-25b4-4ad1-8be2-7ee858119bdc" containerName="storage-initializer" Apr 16 20:26:44.120818 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.120752 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e409eca-25b4-4ad1-8be2-7ee858119bdc" containerName="storage-initializer" Apr 16 20:26:44.120818 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.120781 2577 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="6e409eca-25b4-4ad1-8be2-7ee858119bdc" containerName="main" Apr 16 20:26:44.120818 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.120793 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e409eca-25b4-4ad1-8be2-7ee858119bdc" containerName="main" Apr 16 20:26:44.121008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.120890 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e409eca-25b4-4ad1-8be2-7ee858119bdc" containerName="main" Apr 16 20:26:44.125004 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.124987 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.127629 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.127601 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 20:26:44.130042 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.130020 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5"] Apr 16 20:26:44.208057 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.208026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-home\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.208243 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.208070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kserve-provision-location\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: 
\"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.208243 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.208129 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-tls-certs\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.208243 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.208194 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh9hp\" (UniqueName: \"kubernetes.io/projected/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kube-api-access-lh9hp\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.208243 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.208224 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-dshm\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.208467 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.208270 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-model-cache\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.309196 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:26:44.309158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lh9hp\" (UniqueName: \"kubernetes.io/projected/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kube-api-access-lh9hp\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.309377 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.309213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-dshm\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.309377 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.309250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-model-cache\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.309377 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.309312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-home\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.309377 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.309369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kserve-provision-location\") pod 
\"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.309591 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.309418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-tls-certs\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.309768 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.309746 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-model-cache\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.309870 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.309847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-home\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.309916 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.309867 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kserve-provision-location\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.311590 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.311565 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-dshm\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.311835 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.311821 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-tls-certs\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.319822 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.319802 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh9hp\" (UniqueName: \"kubernetes.io/projected/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kube-api-access-lh9hp\") pod \"stop-feature-test-kserve-5f8b49c797-55mg5\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.437667 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.437578 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:44.567536 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:44.567499 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5"] Apr 16 20:26:44.570872 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:26:44.570834 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34962b5c_cbcd_4a3a_8d43_f29b1c02d038.slice/crio-04337a35648f1c512575679a64fc0219f676623920a393160eeda54ed7e00874 WatchSource:0}: Error finding container 04337a35648f1c512575679a64fc0219f676623920a393160eeda54ed7e00874: Status 404 returned error can't find the container with id 04337a35648f1c512575679a64fc0219f676623920a393160eeda54ed7e00874 Apr 16 20:26:45.217907 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:45.217862 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" event={"ID":"34962b5c-cbcd-4a3a-8d43-f29b1c02d038","Type":"ContainerStarted","Data":"19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9"} Apr 16 20:26:45.217907 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:45.217902 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" event={"ID":"34962b5c-cbcd-4a3a-8d43-f29b1c02d038","Type":"ContainerStarted","Data":"04337a35648f1c512575679a64fc0219f676623920a393160eeda54ed7e00874"} Apr 16 20:26:49.236096 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:49.235999 2577 generic.go:358] "Generic (PLEG): container finished" podID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerID="19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9" exitCode=0 Apr 16 20:26:49.236096 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:49.236045 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" event={"ID":"34962b5c-cbcd-4a3a-8d43-f29b1c02d038","Type":"ContainerDied","Data":"19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9"} Apr 16 20:26:50.242395 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:50.242357 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" event={"ID":"34962b5c-cbcd-4a3a-8d43-f29b1c02d038","Type":"ContainerStarted","Data":"6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa"} Apr 16 20:26:50.264636 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:50.264574 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" podStartSLOduration=6.264554467 podStartE2EDuration="6.264554467s" podCreationTimestamp="2026-04-16 20:26:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:26:50.261162842 +0000 UTC m=+909.937431074" watchObservedRunningTime="2026-04-16 20:26:50.264554467 +0000 UTC m=+909.940822701" Apr 16 20:26:52.712487 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:52.712446 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 16 20:26:54.438547 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:54.438503 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:54.439001 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:54.438560 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:26:54.440307 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:26:54.440274 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 16 20:27:02.712195 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:02.712147 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 16 20:27:04.438501 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:04.438460 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 16 20:27:12.712499 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:12.712448 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 16 20:27:14.438129 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:14.438083 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" probeResult="failure" 
output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 16 20:27:22.712103 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:22.712049 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 16 20:27:24.438756 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:24.438708 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 16 20:27:32.722045 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:32.722011 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:27:32.730441 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:32.730412 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:27:34.438810 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:34.438770 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 16 20:27:39.403860 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:39.403823 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th"] Apr 16 20:27:39.404246 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:39.404196 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" containerID="cri-o://bf32f33f9d8b7963270282d500270a5f18b057645e899016ccc7e28c1fbab4b6" gracePeriod=30 Apr 16 20:27:44.438631 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:44.438589 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 16 20:27:54.438422 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:27:54.438375 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 16 20:28:04.438706 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:04.438605 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 16 20:28:09.587850 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.587826 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th_88b37c33-7f6d-44b4-9915-bbcd9c15ef62/main/0.log" Apr 16 20:28:09.588258 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.588209 2577 generic.go:358] "Generic (PLEG): container finished" podID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerID="bf32f33f9d8b7963270282d500270a5f18b057645e899016ccc7e28c1fbab4b6" exitCode=137 Apr 16 20:28:09.588426 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.588259 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" event={"ID":"88b37c33-7f6d-44b4-9915-bbcd9c15ef62","Type":"ContainerDied","Data":"bf32f33f9d8b7963270282d500270a5f18b057645e899016ccc7e28c1fbab4b6"} Apr 16 20:28:09.668964 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.668890 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th_88b37c33-7f6d-44b4-9915-bbcd9c15ef62/main/0.log" Apr 16 20:28:09.669303 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.669287 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:28:09.702927 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.702896 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-home\") pod \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " Apr 16 20:28:09.703102 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.702961 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-model-cache\") pod \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " Apr 16 20:28:09.703102 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.703011 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9x86\" (UniqueName: \"kubernetes.io/projected/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kube-api-access-n9x86\") pod \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " Apr 16 20:28:09.703237 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.703162 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-tls-certs\") pod \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " Apr 16 20:28:09.703237 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.703186 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-model-cache" (OuterVolumeSpecName: "model-cache") pod "88b37c33-7f6d-44b4-9915-bbcd9c15ef62" (UID: "88b37c33-7f6d-44b4-9915-bbcd9c15ef62"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:28:09.703346 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.703242 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-dshm\") pod \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " Apr 16 20:28:09.703346 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.703295 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kserve-provision-location\") pod \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\" (UID: \"88b37c33-7f6d-44b4-9915-bbcd9c15ef62\") " Apr 16 20:28:09.703346 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.703299 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-home" (OuterVolumeSpecName: "home") pod "88b37c33-7f6d-44b4-9915-bbcd9c15ef62" (UID: "88b37c33-7f6d-44b4-9915-bbcd9c15ef62"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:28:09.703610 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.703586 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-home\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:09.703678 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.703618 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-model-cache\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:09.705657 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.705613 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kube-api-access-n9x86" (OuterVolumeSpecName: "kube-api-access-n9x86") pod "88b37c33-7f6d-44b4-9915-bbcd9c15ef62" (UID: "88b37c33-7f6d-44b4-9915-bbcd9c15ef62"). InnerVolumeSpecName "kube-api-access-n9x86". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:28:09.705766 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.705685 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "88b37c33-7f6d-44b4-9915-bbcd9c15ef62" (UID: "88b37c33-7f6d-44b4-9915-bbcd9c15ef62"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:28:09.705766 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.705698 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-dshm" (OuterVolumeSpecName: "dshm") pod "88b37c33-7f6d-44b4-9915-bbcd9c15ef62" (UID: "88b37c33-7f6d-44b4-9915-bbcd9c15ef62"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:28:09.765564 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.765506 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "88b37c33-7f6d-44b4-9915-bbcd9c15ef62" (UID: "88b37c33-7f6d-44b4-9915-bbcd9c15ef62"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:28:09.804475 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.804447 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-tls-certs\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:09.804475 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.804476 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-dshm\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:09.804642 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.804491 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kserve-provision-location\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:09.804642 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:09.804504 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n9x86\" (UniqueName: \"kubernetes.io/projected/88b37c33-7f6d-44b4-9915-bbcd9c15ef62-kube-api-access-n9x86\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:10.594064 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:10.594038 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th_88b37c33-7f6d-44b4-9915-bbcd9c15ef62/main/0.log" Apr 16 20:28:10.594515 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:10.594461 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" event={"ID":"88b37c33-7f6d-44b4-9915-bbcd9c15ef62","Type":"ContainerDied","Data":"2244d9d3f3b0362fae83ba0abeaf0279f88db608573a6cb581034817518480a3"} Apr 16 20:28:10.594515 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:10.594502 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th" Apr 16 20:28:10.594602 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:10.594509 2577 scope.go:117] "RemoveContainer" containerID="bf32f33f9d8b7963270282d500270a5f18b057645e899016ccc7e28c1fbab4b6" Apr 16 20:28:10.618077 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:10.618046 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th"] Apr 16 20:28:10.622435 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:10.622413 2577 scope.go:117] "RemoveContainer" containerID="51b907463e7052641077ba2f65db30932e3985c47b217b00d5f0e69acc5ef8cd" Apr 16 20:28:10.624130 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:10.624109 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-55896d89c9gb8th"] Apr 16 20:28:10.876254 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:10.876175 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" path="/var/lib/kubelet/pods/88b37c33-7f6d-44b4-9915-bbcd9c15ef62/volumes" Apr 16 20:28:14.438698 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:14.438658 2577 prober.go:120] "Probe 
failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 16 20:28:24.448196 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:24.448166 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:28:24.455805 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:24.455776 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:28:25.692577 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:25.692542 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5"] Apr 16 20:28:25.693626 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:25.693570 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" containerID="cri-o://6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa" gracePeriod=30 Apr 16 20:28:48.499268 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.499230 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v"] Apr 16 20:28:48.499734 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.499662 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="storage-initializer" Apr 16 20:28:48.499734 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.499675 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="storage-initializer" Apr 16 
20:28:48.499734 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.499684 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" Apr 16 20:28:48.499734 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.499689 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" Apr 16 20:28:48.499866 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.499764 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="88b37c33-7f6d-44b4-9915-bbcd9c15ef62" containerName="main" Apr 16 20:28:48.504158 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.504140 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.514468 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.514440 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v"] Apr 16 20:28:48.644833 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.644793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-model-cache\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.645040 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.644851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c56f5f1c-089c-44a3-af14-8b95bc23df2a-tls-certs\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.645040 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:28:48.644924 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-dshm\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.645040 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.644942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vqdq\" (UniqueName: \"kubernetes.io/projected/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kube-api-access-2vqdq\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.645206 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.645066 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-home\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.645206 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.645107 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kserve-provision-location\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.745999 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.745964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-dshm\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.746157 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.746008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vqdq\" (UniqueName: \"kubernetes.io/projected/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kube-api-access-2vqdq\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.746157 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.746062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-home\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.746157 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.746092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kserve-provision-location\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.746157 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.746149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-model-cache\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.746359 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.746170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c56f5f1c-089c-44a3-af14-8b95bc23df2a-tls-certs\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.746444 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.746425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-home\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.746562 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.746517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kserve-provision-location\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.746562 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.746522 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-model-cache\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.748162 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.748142 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-dshm\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.748426 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.748409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c56f5f1c-089c-44a3-af14-8b95bc23df2a-tls-certs\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.753635 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.753582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vqdq\" (UniqueName: \"kubernetes.io/projected/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kube-api-access-2vqdq\") pod \"stop-feature-test-kserve-5f8b49c797-l8n4v\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.815428 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.815400 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:48.945202 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.945176 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v"] Apr 16 20:28:48.946149 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:28:48.946122 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc56f5f1c_089c_44a3_af14_8b95bc23df2a.slice/crio-c9c3516e5b95f758d64a478aad79db235d25f18c50e5eb06a6e2c0161801f799 WatchSource:0}: Error finding container c9c3516e5b95f758d64a478aad79db235d25f18c50e5eb06a6e2c0161801f799: Status 404 returned error can't find the container with id c9c3516e5b95f758d64a478aad79db235d25f18c50e5eb06a6e2c0161801f799 Apr 16 20:28:48.948045 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:48.948017 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:28:49.748259 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:49.748221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" event={"ID":"c56f5f1c-089c-44a3-af14-8b95bc23df2a","Type":"ContainerStarted","Data":"984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd"} Apr 16 20:28:49.748259 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:49.748263 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" event={"ID":"c56f5f1c-089c-44a3-af14-8b95bc23df2a","Type":"ContainerStarted","Data":"c9c3516e5b95f758d64a478aad79db235d25f18c50e5eb06a6e2c0161801f799"} Apr 16 20:28:53.765765 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:53.765730 2577 generic.go:358] "Generic (PLEG): container finished" podID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" 
containerID="984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd" exitCode=0 Apr 16 20:28:53.766276 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:53.765805 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" event={"ID":"c56f5f1c-089c-44a3-af14-8b95bc23df2a","Type":"ContainerDied","Data":"984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd"} Apr 16 20:28:54.771928 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:54.771893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" event={"ID":"c56f5f1c-089c-44a3-af14-8b95bc23df2a","Type":"ContainerStarted","Data":"a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0"} Apr 16 20:28:54.790960 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:54.790895 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" podStartSLOduration=6.7908746749999995 podStartE2EDuration="6.790874675s" podCreationTimestamp="2026-04-16 20:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:28:54.789696305 +0000 UTC m=+1034.465964541" watchObservedRunningTime="2026-04-16 20:28:54.790874675 +0000 UTC m=+1034.467142905" Apr 16 20:28:55.976158 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:55.976134 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5f8b49c797-55mg5_34962b5c-cbcd-4a3a-8d43-f29b1c02d038/main/0.log" Apr 16 20:28:55.976580 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:55.976564 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:28:56.010297 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.010270 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh9hp\" (UniqueName: \"kubernetes.io/projected/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kube-api-access-lh9hp\") pod \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " Apr 16 20:28:56.010481 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.010309 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kserve-provision-location\") pod \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " Apr 16 20:28:56.010481 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.010345 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-tls-certs\") pod \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " Apr 16 20:28:56.010481 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.010364 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-home\") pod \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " Apr 16 20:28:56.010481 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.010385 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-dshm\") pod \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " Apr 16 20:28:56.010481 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:28:56.010430 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-model-cache\") pod \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\" (UID: \"34962b5c-cbcd-4a3a-8d43-f29b1c02d038\") " Apr 16 20:28:56.010767 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.010707 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-home" (OuterVolumeSpecName: "home") pod "34962b5c-cbcd-4a3a-8d43-f29b1c02d038" (UID: "34962b5c-cbcd-4a3a-8d43-f29b1c02d038"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:28:56.010887 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.010862 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-model-cache" (OuterVolumeSpecName: "model-cache") pod "34962b5c-cbcd-4a3a-8d43-f29b1c02d038" (UID: "34962b5c-cbcd-4a3a-8d43-f29b1c02d038"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:28:56.013101 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.013060 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "34962b5c-cbcd-4a3a-8d43-f29b1c02d038" (UID: "34962b5c-cbcd-4a3a-8d43-f29b1c02d038"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:28:56.013262 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.013230 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kube-api-access-lh9hp" (OuterVolumeSpecName: "kube-api-access-lh9hp") pod "34962b5c-cbcd-4a3a-8d43-f29b1c02d038" (UID: "34962b5c-cbcd-4a3a-8d43-f29b1c02d038"). InnerVolumeSpecName "kube-api-access-lh9hp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:28:56.013352 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.013302 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-dshm" (OuterVolumeSpecName: "dshm") pod "34962b5c-cbcd-4a3a-8d43-f29b1c02d038" (UID: "34962b5c-cbcd-4a3a-8d43-f29b1c02d038"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:28:56.075763 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.075715 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34962b5c-cbcd-4a3a-8d43-f29b1c02d038" (UID: "34962b5c-cbcd-4a3a-8d43-f29b1c02d038"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:28:56.111189 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.111156 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lh9hp\" (UniqueName: \"kubernetes.io/projected/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kube-api-access-lh9hp\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:56.111189 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.111186 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-kserve-provision-location\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:56.111189 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.111196 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-tls-certs\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:56.111388 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.111207 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-home\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:56.111388 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.111216 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-dshm\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:56.111388 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.111224 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/34962b5c-cbcd-4a3a-8d43-f29b1c02d038-model-cache\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:28:56.781095 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.781067 2577 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5f8b49c797-55mg5_34962b5c-cbcd-4a3a-8d43-f29b1c02d038/main/0.log" Apr 16 20:28:56.781450 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.781427 2577 generic.go:358] "Generic (PLEG): container finished" podID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerID="6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa" exitCode=137 Apr 16 20:28:56.781525 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.781495 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" Apr 16 20:28:56.781525 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.781512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" event={"ID":"34962b5c-cbcd-4a3a-8d43-f29b1c02d038","Type":"ContainerDied","Data":"6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa"} Apr 16 20:28:56.781595 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.781554 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5" event={"ID":"34962b5c-cbcd-4a3a-8d43-f29b1c02d038","Type":"ContainerDied","Data":"04337a35648f1c512575679a64fc0219f676623920a393160eeda54ed7e00874"} Apr 16 20:28:56.781595 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.781576 2577 scope.go:117] "RemoveContainer" containerID="6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa" Apr 16 20:28:56.800939 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.800915 2577 scope.go:117] "RemoveContainer" containerID="19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9" Apr 16 20:28:56.804730 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.804705 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5"] Apr 16 20:28:56.810130 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:28:56.810106 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-55mg5"] Apr 16 20:28:56.862071 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.861966 2577 scope.go:117] "RemoveContainer" containerID="6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa" Apr 16 20:28:56.862317 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:28:56.862296 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa\": container with ID starting with 6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa not found: ID does not exist" containerID="6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa" Apr 16 20:28:56.862367 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.862330 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa"} err="failed to get container status \"6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa\": rpc error: code = NotFound desc = could not find container \"6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa\": container with ID starting with 6765b97da3d14e3f49e12f4fcb3c3fcaebd92a2ca38aed524029f15232d24daa not found: ID does not exist" Apr 16 20:28:56.862415 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.862367 2577 scope.go:117] "RemoveContainer" containerID="19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9" Apr 16 20:28:56.862617 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:28:56.862597 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9\": container with ID starting with 
19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9 not found: ID does not exist" containerID="19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9" Apr 16 20:28:56.862680 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.862625 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9"} err="failed to get container status \"19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9\": rpc error: code = NotFound desc = could not find container \"19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9\": container with ID starting with 19977dd2982d999f4ed60326b672fd52ace42c990b64ec3913e01b04d581a8c9 not found: ID does not exist" Apr 16 20:28:56.876478 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:56.876451 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" path="/var/lib/kubelet/pods/34962b5c-cbcd-4a3a-8d43-f29b1c02d038/volumes" Apr 16 20:28:58.816439 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:58.816395 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:58.816439 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:58.816439 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:28:58.818087 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:28:58.818058 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 16 20:29:08.816603 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:08.816546 2577 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 16 20:29:18.816439 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:18.816396 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 16 20:29:28.816579 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:28.816536 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 16 20:29:38.816052 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.816006 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 16 20:29:38.827571 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.827534 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl"] Apr 16 20:29:38.827997 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.827982 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" Apr 16 20:29:38.827997 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:29:38.827997 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" Apr 16 20:29:38.828120 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.828008 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="storage-initializer" Apr 16 20:29:38.828120 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.828014 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="storage-initializer" Apr 16 20:29:38.828120 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.828088 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="34962b5c-cbcd-4a3a-8d43-f29b1c02d038" containerName="main" Apr 16 20:29:38.831236 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.831220 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:38.833537 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.833521 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 20:29:38.843590 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.843565 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl"] Apr 16 20:29:38.997119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.997083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-dshm\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:38.997119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.997120 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9477d0c5-85ec-43d2-85c8-58c90961d95b-tls-certs\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:38.997352 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.997224 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-model-cache\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:38.997352 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.997284 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-kserve-provision-location\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:38.997466 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.997411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-home\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:38.997537 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:38.997519 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttf6d\" (UniqueName: 
\"kubernetes.io/projected/9477d0c5-85ec-43d2-85c8-58c90961d95b-kube-api-access-ttf6d\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.098716 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.098682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-model-cache\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.098904 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.098724 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-kserve-provision-location\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.098904 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.098757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-home\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.098904 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.098774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttf6d\" (UniqueName: \"kubernetes.io/projected/9477d0c5-85ec-43d2-85c8-58c90961d95b-kube-api-access-ttf6d\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") 
" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.098904 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.098826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-dshm\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.098904 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.098851 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9477d0c5-85ec-43d2-85c8-58c90961d95b-tls-certs\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.099518 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.099161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-model-cache\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.099518 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.099212 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-kserve-provision-location\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.099518 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.099250 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"home\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-home\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.101282 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.101255 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-dshm\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.101531 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.101514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9477d0c5-85ec-43d2-85c8-58c90961d95b-tls-certs\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.111424 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.111397 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttf6d\" (UniqueName: \"kubernetes.io/projected/9477d0c5-85ec-43d2-85c8-58c90961d95b-kube-api-access-ttf6d\") pod \"router-with-refs-test-kserve-566d8bd6d5-kggxl\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.142088 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.142063 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:39.271575 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.271545 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl"] Apr 16 20:29:39.272834 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:29:39.272800 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9477d0c5_85ec_43d2_85c8_58c90961d95b.slice/crio-96fab77d680dad5abfac730d85f73a7e609c17376ae7bbf6814ab8b070e44be7 WatchSource:0}: Error finding container 96fab77d680dad5abfac730d85f73a7e609c17376ae7bbf6814ab8b070e44be7: Status 404 returned error can't find the container with id 96fab77d680dad5abfac730d85f73a7e609c17376ae7bbf6814ab8b070e44be7 Apr 16 20:29:39.948661 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.948625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" event={"ID":"9477d0c5-85ec-43d2-85c8-58c90961d95b","Type":"ContainerStarted","Data":"097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e"} Apr 16 20:29:39.948661 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:39.948666 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" event={"ID":"9477d0c5-85ec-43d2-85c8-58c90961d95b","Type":"ContainerStarted","Data":"96fab77d680dad5abfac730d85f73a7e609c17376ae7bbf6814ab8b070e44be7"} Apr 16 20:29:43.965575 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:43.965538 2577 generic.go:358] "Generic (PLEG): container finished" podID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerID="097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e" exitCode=0 Apr 16 20:29:43.965933 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:43.965612 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" event={"ID":"9477d0c5-85ec-43d2-85c8-58c90961d95b","Type":"ContainerDied","Data":"097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e"} Apr 16 20:29:44.972019 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:44.971978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" event={"ID":"9477d0c5-85ec-43d2-85c8-58c90961d95b","Type":"ContainerStarted","Data":"3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60"} Apr 16 20:29:44.994134 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:44.994086 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" podStartSLOduration=6.994072354 podStartE2EDuration="6.994072354s" podCreationTimestamp="2026-04-16 20:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:29:44.992521898 +0000 UTC m=+1084.668790128" watchObservedRunningTime="2026-04-16 20:29:44.994072354 +0000 UTC m=+1084.670340591" Apr 16 20:29:48.816125 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:48.816070 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 16 20:29:49.142811 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:49.142774 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:49.143057 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:49.142823 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:29:49.144589 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:49.144559 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 20:29:58.816709 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:58.816660 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 16 20:29:59.143592 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:29:59.143547 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 20:30:08.816760 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:30:08.816705 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 16 20:30:09.142811 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:30:09.142775 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 20:30:18.816497 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:30:18.816440 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 16 20:30:19.143263 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:30:19.143216 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 20:30:28.825809 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:30:28.825772 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:30:28.833458 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:30:28.833436 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:30:29.143312 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:30:29.143258 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 20:30:30.332483 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:30:30.332451 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v"] Apr 16 20:30:30.333333 ip-10-0-131-31 kubenswrapper[2577]: 
I0416 20:30:30.333289 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" containerID="cri-o://a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0" gracePeriod=30 Apr 16 20:30:39.142796 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:30:39.142752 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 20:30:49.142888 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:30:49.142840 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 20:30:59.143278 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:30:59.143223 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 20:31:00.598775 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.598713 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5f8b49c797-l8n4v_c56f5f1c-089c-44a3-af14-8b95bc23df2a/main/0.log" Apr 16 20:31:00.599152 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.599112 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:31:00.760890 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.760859 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-model-cache\") pod \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " Apr 16 20:31:00.761121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.760970 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-dshm\") pod \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " Apr 16 20:31:00.761121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.761018 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kserve-provision-location\") pod \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " Apr 16 20:31:00.761121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.761068 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-home\") pod \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " Apr 16 20:31:00.761121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.761126 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vqdq\" (UniqueName: \"kubernetes.io/projected/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kube-api-access-2vqdq\") pod \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " Apr 16 20:31:00.761360 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:31:00.761166 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c56f5f1c-089c-44a3-af14-8b95bc23df2a-tls-certs\") pod \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\" (UID: \"c56f5f1c-089c-44a3-af14-8b95bc23df2a\") " Apr 16 20:31:00.761360 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.761161 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-model-cache" (OuterVolumeSpecName: "model-cache") pod "c56f5f1c-089c-44a3-af14-8b95bc23df2a" (UID: "c56f5f1c-089c-44a3-af14-8b95bc23df2a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:00.761464 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.761451 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-model-cache\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:31:00.761610 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.761582 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-home" (OuterVolumeSpecName: "home") pod "c56f5f1c-089c-44a3-af14-8b95bc23df2a" (UID: "c56f5f1c-089c-44a3-af14-8b95bc23df2a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:00.763725 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.763696 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-dshm" (OuterVolumeSpecName: "dshm") pod "c56f5f1c-089c-44a3-af14-8b95bc23df2a" (UID: "c56f5f1c-089c-44a3-af14-8b95bc23df2a"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:00.763832 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.763765 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56f5f1c-089c-44a3-af14-8b95bc23df2a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c56f5f1c-089c-44a3-af14-8b95bc23df2a" (UID: "c56f5f1c-089c-44a3-af14-8b95bc23df2a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:00.763832 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.763766 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kube-api-access-2vqdq" (OuterVolumeSpecName: "kube-api-access-2vqdq") pod "c56f5f1c-089c-44a3-af14-8b95bc23df2a" (UID: "c56f5f1c-089c-44a3-af14-8b95bc23df2a"). InnerVolumeSpecName "kube-api-access-2vqdq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:31:00.818577 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.818517 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c56f5f1c-089c-44a3-af14-8b95bc23df2a" (UID: "c56f5f1c-089c-44a3-af14-8b95bc23df2a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:00.862411 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.862332 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2vqdq\" (UniqueName: \"kubernetes.io/projected/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kube-api-access-2vqdq\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:31:00.862411 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.862371 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c56f5f1c-089c-44a3-af14-8b95bc23df2a-tls-certs\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:31:00.862411 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.862388 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-dshm\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:31:00.862411 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.862402 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-kserve-provision-location\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:31:00.862411 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:00.862414 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c56f5f1c-089c-44a3-af14-8b95bc23df2a-home\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:31:01.297495 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.297421 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5f8b49c797-l8n4v_c56f5f1c-089c-44a3-af14-8b95bc23df2a/main/0.log" Apr 16 20:31:01.297781 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.297757 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerID="a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0" exitCode=137 Apr 16 20:31:01.297856 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.297841 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" event={"ID":"c56f5f1c-089c-44a3-af14-8b95bc23df2a","Type":"ContainerDied","Data":"a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0"} Apr 16 20:31:01.297896 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.297864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" event={"ID":"c56f5f1c-089c-44a3-af14-8b95bc23df2a","Type":"ContainerDied","Data":"c9c3516e5b95f758d64a478aad79db235d25f18c50e5eb06a6e2c0161801f799"} Apr 16 20:31:01.297896 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.297880 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v" Apr 16 20:31:01.297985 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.297882 2577 scope.go:117] "RemoveContainer" containerID="a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0" Apr 16 20:31:01.317441 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.317413 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v"] Apr 16 20:31:01.318114 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.318088 2577 scope.go:117] "RemoveContainer" containerID="984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd" Apr 16 20:31:01.320570 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.320546 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5f8b49c797-l8n4v"] Apr 16 20:31:01.383286 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.383133 2577 scope.go:117] "RemoveContainer" 
containerID="a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0" Apr 16 20:31:01.383472 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:31:01.383455 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0\": container with ID starting with a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0 not found: ID does not exist" containerID="a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0" Apr 16 20:31:01.383526 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.383481 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0"} err="failed to get container status \"a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0\": rpc error: code = NotFound desc = could not find container \"a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0\": container with ID starting with a3e35adb89afbf8e0a746bb54202247c94ed614bd20010c60213869552b052b0 not found: ID does not exist" Apr 16 20:31:01.383526 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.383501 2577 scope.go:117] "RemoveContainer" containerID="984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd" Apr 16 20:31:01.383790 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:31:01.383775 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd\": container with ID starting with 984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd not found: ID does not exist" containerID="984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd" Apr 16 20:31:01.383846 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:01.383793 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd"} err="failed to get container status \"984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd\": rpc error: code = NotFound desc = could not find container \"984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd\": container with ID starting with 984804879cd71a5fa2b7bea1508ddf363af7a8c5b4bcaa969ab74d49828216fd not found: ID does not exist" Apr 16 20:31:02.876825 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:02.876788 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" path="/var/lib/kubelet/pods/c56f5f1c-089c-44a3-af14-8b95bc23df2a/volumes" Apr 16 20:31:09.143350 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:09.143298 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 16 20:31:19.152347 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:19.152318 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:31:19.160766 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:19.160743 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:31:29.975864 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:29.975829 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl"] Apr 16 20:31:29.976459 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:29.976214 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" containerID="cri-o://3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60" gracePeriod=30 Apr 16 20:31:39.356424 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.356394 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7"] Apr 16 20:31:39.356922 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.356892 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="storage-initializer" Apr 16 20:31:39.356922 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.356923 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="storage-initializer" Apr 16 20:31:39.357153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.356966 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" Apr 16 20:31:39.357153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.356975 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" Apr 16 20:31:39.357153 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.357063 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c56f5f1c-089c-44a3-af14-8b95bc23df2a" containerName="main" Apr 16 20:31:39.360238 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.360193 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.363305 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.363283 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 20:31:39.373241 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.373217 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7"] Apr 16 20:31:39.492161 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.492127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.492161 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.492163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad65e36f-272c-4e78-9d50-efafd6038367-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.492407 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.492186 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwtp6\" (UniqueName: \"kubernetes.io/projected/ad65e36f-272c-4e78-9d50-efafd6038367-kube-api-access-pwtp6\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.492407 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.492249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.492407 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.492386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.492559 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.492441 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.593564 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.593530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: 
\"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.593733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.593684 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.593733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.593712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad65e36f-272c-4e78-9d50-efafd6038367-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.593854 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.593737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwtp6\" (UniqueName: \"kubernetes.io/projected/ad65e36f-272c-4e78-9d50-efafd6038367-kube-api-access-pwtp6\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.593854 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.593768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.593976 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.593897 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.594035 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.593985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.594035 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.594017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.594391 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.594360 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" 
Apr 16 20:31:39.596440 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.596414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad65e36f-272c-4e78-9d50-efafd6038367-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.596581 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.596558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.612235 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.612158 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwtp6\" (UniqueName: \"kubernetes.io/projected/ad65e36f-272c-4e78-9d50-efafd6038367-kube-api-access-pwtp6\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.673986 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.673938 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:39.804085 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:39.804061 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7"] Apr 16 20:31:39.805988 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:31:39.805934 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad65e36f_272c_4e78_9d50_efafd6038367.slice/crio-9f22b181cb44181d01b0a5484425dd2b24689ca6f546443f783f361e2061eb02 WatchSource:0}: Error finding container 9f22b181cb44181d01b0a5484425dd2b24689ca6f546443f783f361e2061eb02: Status 404 returned error can't find the container with id 9f22b181cb44181d01b0a5484425dd2b24689ca6f546443f783f361e2061eb02 Apr 16 20:31:40.460840 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:40.460805 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" event={"ID":"ad65e36f-272c-4e78-9d50-efafd6038367","Type":"ContainerStarted","Data":"1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f"} Apr 16 20:31:40.460840 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:40.460848 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" event={"ID":"ad65e36f-272c-4e78-9d50-efafd6038367","Type":"ContainerStarted","Data":"9f22b181cb44181d01b0a5484425dd2b24689ca6f546443f783f361e2061eb02"} Apr 16 20:31:40.867117 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:40.867060 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:31:40.870347 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:40.870309 
2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:31:44.478686 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:44.478643 2577 generic.go:358] "Generic (PLEG): container finished" podID="ad65e36f-272c-4e78-9d50-efafd6038367" containerID="1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f" exitCode=0 Apr 16 20:31:44.479095 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:44.478719 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" event={"ID":"ad65e36f-272c-4e78-9d50-efafd6038367","Type":"ContainerDied","Data":"1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f"} Apr 16 20:31:45.484855 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:45.484819 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" event={"ID":"ad65e36f-272c-4e78-9d50-efafd6038367","Type":"ContainerStarted","Data":"6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819"} Apr 16 20:31:45.505371 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:45.505326 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" podStartSLOduration=6.505310274 podStartE2EDuration="6.505310274s" podCreationTimestamp="2026-04-16 20:31:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:45.502848219 +0000 UTC m=+1205.179116451" watchObservedRunningTime="2026-04-16 20:31:45.505310274 +0000 UTC m=+1205.181578488" Apr 16 20:31:49.674151 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:49.674106 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:49.674151 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:49.674155 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:31:49.675844 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:49.675809 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 16 20:31:59.674806 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:31:59.674767 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 16 20:32:00.263637 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.263609 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-566d8bd6d5-kggxl_9477d0c5-85ec-43d2-85c8-58c90961d95b/main/0.log" Apr 16 20:32:00.264025 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.264006 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:32:00.280604 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.280579 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttf6d\" (UniqueName: \"kubernetes.io/projected/9477d0c5-85ec-43d2-85c8-58c90961d95b-kube-api-access-ttf6d\") pod \"9477d0c5-85ec-43d2-85c8-58c90961d95b\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " Apr 16 20:32:00.280729 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.280645 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-model-cache\") pod \"9477d0c5-85ec-43d2-85c8-58c90961d95b\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " Apr 16 20:32:00.280816 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.280745 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-kserve-provision-location\") pod \"9477d0c5-85ec-43d2-85c8-58c90961d95b\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " Apr 16 20:32:00.280816 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.280798 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-home\") pod \"9477d0c5-85ec-43d2-85c8-58c90961d95b\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " Apr 16 20:32:00.280939 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.280836 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9477d0c5-85ec-43d2-85c8-58c90961d95b-tls-certs\") pod \"9477d0c5-85ec-43d2-85c8-58c90961d95b\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " Apr 16 20:32:00.280939 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:32:00.280883 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-dshm\") pod \"9477d0c5-85ec-43d2-85c8-58c90961d95b\" (UID: \"9477d0c5-85ec-43d2-85c8-58c90961d95b\") " Apr 16 20:32:00.280939 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.280897 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-model-cache" (OuterVolumeSpecName: "model-cache") pod "9477d0c5-85ec-43d2-85c8-58c90961d95b" (UID: "9477d0c5-85ec-43d2-85c8-58c90961d95b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:32:00.281195 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.281176 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-model-cache\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:32:00.281270 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.281202 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-home" (OuterVolumeSpecName: "home") pod "9477d0c5-85ec-43d2-85c8-58c90961d95b" (UID: "9477d0c5-85ec-43d2-85c8-58c90961d95b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:32:00.283480 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.283336 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9477d0c5-85ec-43d2-85c8-58c90961d95b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9477d0c5-85ec-43d2-85c8-58c90961d95b" (UID: "9477d0c5-85ec-43d2-85c8-58c90961d95b"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:32:00.283480 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.283465 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9477d0c5-85ec-43d2-85c8-58c90961d95b-kube-api-access-ttf6d" (OuterVolumeSpecName: "kube-api-access-ttf6d") pod "9477d0c5-85ec-43d2-85c8-58c90961d95b" (UID: "9477d0c5-85ec-43d2-85c8-58c90961d95b"). InnerVolumeSpecName "kube-api-access-ttf6d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:32:00.283648 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.283547 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-dshm" (OuterVolumeSpecName: "dshm") pod "9477d0c5-85ec-43d2-85c8-58c90961d95b" (UID: "9477d0c5-85ec-43d2-85c8-58c90961d95b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:32:00.339677 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.339635 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9477d0c5-85ec-43d2-85c8-58c90961d95b" (UID: "9477d0c5-85ec-43d2-85c8-58c90961d95b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:32:00.381650 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.381614 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-dshm\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:32:00.381650 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.381643 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ttf6d\" (UniqueName: \"kubernetes.io/projected/9477d0c5-85ec-43d2-85c8-58c90961d95b-kube-api-access-ttf6d\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:32:00.381650 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.381654 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-kserve-provision-location\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:32:00.381875 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.381664 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9477d0c5-85ec-43d2-85c8-58c90961d95b-home\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:32:00.381875 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.381673 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9477d0c5-85ec-43d2-85c8-58c90961d95b-tls-certs\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:32:00.546439 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.546351 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-566d8bd6d5-kggxl_9477d0c5-85ec-43d2-85c8-58c90961d95b/main/0.log" Apr 16 20:32:00.546762 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.546738 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerID="3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60" exitCode=137 Apr 16 20:32:00.546838 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.546826 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" Apr 16 20:32:00.546924 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.546818 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" event={"ID":"9477d0c5-85ec-43d2-85c8-58c90961d95b","Type":"ContainerDied","Data":"3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60"} Apr 16 20:32:00.546998 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.546956 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl" event={"ID":"9477d0c5-85ec-43d2-85c8-58c90961d95b","Type":"ContainerDied","Data":"96fab77d680dad5abfac730d85f73a7e609c17376ae7bbf6814ab8b070e44be7"} Apr 16 20:32:00.546998 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.546972 2577 scope.go:117] "RemoveContainer" containerID="3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60" Apr 16 20:32:00.571745 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.571709 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl"] Apr 16 20:32:00.576742 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.576716 2577 scope.go:117] "RemoveContainer" containerID="097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e" Apr 16 20:32:00.579059 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.579037 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-566d8bd6d5-kggxl"] Apr 16 20:32:00.650121 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.650098 2577 scope.go:117] 
"RemoveContainer" containerID="3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60" Apr 16 20:32:00.650474 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:32:00.650452 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60\": container with ID starting with 3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60 not found: ID does not exist" containerID="3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60" Apr 16 20:32:00.650563 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.650487 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60"} err="failed to get container status \"3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60\": rpc error: code = NotFound desc = could not find container \"3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60\": container with ID starting with 3c80d47de0ade9c718bb2272431a676d1437bceb0b359376e9639d0b3aa5db60 not found: ID does not exist" Apr 16 20:32:00.650563 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.650508 2577 scope.go:117] "RemoveContainer" containerID="097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e" Apr 16 20:32:00.650773 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:32:00.650756 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e\": container with ID starting with 097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e not found: ID does not exist" containerID="097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e" Apr 16 20:32:00.650813 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.650778 2577 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e"} err="failed to get container status \"097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e\": rpc error: code = NotFound desc = could not find container \"097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e\": container with ID starting with 097a155fb91572f0db9790168c79f39f762dba15cc68ae2f1b24ba03ac9bd97e not found: ID does not exist" Apr 16 20:32:00.877122 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:00.877092 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" path="/var/lib/kubelet/pods/9477d0c5-85ec-43d2-85c8-58c90961d95b/volumes" Apr 16 20:32:09.674562 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:09.674520 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 16 20:32:19.674479 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:19.674436 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 16 20:32:29.675306 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:29.675208 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: 
connect: connection refused" Apr 16 20:32:39.674598 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:39.674548 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 16 20:32:49.674613 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:49.674568 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 16 20:32:59.674503 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:32:59.674460 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 16 20:33:09.675257 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:33:09.675210 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 16 20:33:19.683932 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:33:19.683901 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:33:19.691705 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:33:19.691680 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:34:02.445071 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:02.444981 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7"] Apr 16 20:34:02.445638 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:02.445349 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" containerID="cri-o://6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819" gracePeriod=30 Apr 16 20:34:18.531363 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.531329 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg"] Apr 16 20:34:18.531904 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.531731 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="storage-initializer" Apr 16 20:34:18.531904 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.531744 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="storage-initializer" Apr 16 20:34:18.531904 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.531755 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" Apr 16 20:34:18.531904 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.531761 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" Apr 16 20:34:18.531904 ip-10-0-131-31 kubenswrapper[2577]: 
I0416 20:34:18.531820 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9477d0c5-85ec-43d2-85c8-58c90961d95b" containerName="main" Apr 16 20:34:18.534082 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.534066 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.536604 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.536582 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-f8xjd\"" Apr 16 20:34:18.536776 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.536582 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 20:34:18.550810 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.550746 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg"] Apr 16 20:34:18.554213 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.554185 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs"] Apr 16 20:34:18.556887 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.556868 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.570733 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.570703 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs"] Apr 16 20:34:18.647615 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647585 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwtjq\" (UniqueName: \"kubernetes.io/projected/69a059df-08aa-4f24-b614-6ee2ded6900a-kube-api-access-qwtjq\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.647779 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-dshm\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.647779 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.647779 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647686 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.647779 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647735 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/69a059df-08aa-4f24-b614-6ee2ded6900a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.647779 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647758 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-home\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.647779 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.648048 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-model-cache\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.648048 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647855 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.648048 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647875 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.648048 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkfn9\" (UniqueName: \"kubernetes.io/projected/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kube-api-access-mkfn9\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.648048 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.647978 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.749200 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-dshm\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.749200 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.749458 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.749458 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749255 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/69a059df-08aa-4f24-b614-6ee2ded6900a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: 
\"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.749458 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-home\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.749458 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.749458 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-model-cache\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.749458 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.749458 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.749829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkfn9\" (UniqueName: \"kubernetes.io/projected/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kube-api-access-mkfn9\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.749829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.749829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwtjq\" (UniqueName: \"kubernetes.io/projected/69a059df-08aa-4f24-b614-6ee2ded6900a-kube-api-access-qwtjq\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.749829 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749718 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-home\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.749829 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749765 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.750113 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749940 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.750113 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.749985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.750113 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.750079 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.750271 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.750181 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-model-cache\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.751794 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.751772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.752522 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.752498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-dshm\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.752938 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.752916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.753670 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.753183 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/69a059df-08aa-4f24-b614-6ee2ded6900a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.761448 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.761427 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwtjq\" (UniqueName: \"kubernetes.io/projected/69a059df-08aa-4f24-b614-6ee2ded6900a-kube-api-access-qwtjq\") pod \"custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.761754 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.761732 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkfn9\" (UniqueName: \"kubernetes.io/projected/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kube-api-access-mkfn9\") pod \"custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.844923 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.844883 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:18.867977 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.867932 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:18.984357 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.984325 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg"] Apr 16 20:34:18.987726 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:34:18.987689 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6705e2e_53fe_4fc1_b5ab_082ac0dcacae.slice/crio-39f77d35911020737ea7fbe55d1fbade91b62bb951b9a9061671cd515876b267 WatchSource:0}: Error finding container 39f77d35911020737ea7fbe55d1fbade91b62bb951b9a9061671cd515876b267: Status 404 returned error can't find the container with id 39f77d35911020737ea7fbe55d1fbade91b62bb951b9a9061671cd515876b267 Apr 16 20:34:18.989508 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:18.989491 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:34:19.010008 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:19.009983 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs"] Apr 16 20:34:19.011378 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:34:19.011350 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a059df_08aa_4f24_b614_6ee2ded6900a.slice/crio-9f54dada4aea43681c9a0c25f2c55cd10a5825f88d6fd122ed63f2f3efebbef6 WatchSource:0}: Error finding container 9f54dada4aea43681c9a0c25f2c55cd10a5825f88d6fd122ed63f2f3efebbef6: Status 404 returned error can't find the container with id 9f54dada4aea43681c9a0c25f2c55cd10a5825f88d6fd122ed63f2f3efebbef6 Apr 16 20:34:19.105427 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:19.105333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" event={"ID":"69a059df-08aa-4f24-b614-6ee2ded6900a","Type":"ContainerStarted","Data":"b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4"} Apr 16 20:34:19.105427 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:19.105376 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" event={"ID":"69a059df-08aa-4f24-b614-6ee2ded6900a","Type":"ContainerStarted","Data":"9f54dada4aea43681c9a0c25f2c55cd10a5825f88d6fd122ed63f2f3efebbef6"} Apr 16 20:34:19.106496 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:19.106461 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" event={"ID":"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae","Type":"ContainerStarted","Data":"39f77d35911020737ea7fbe55d1fbade91b62bb951b9a9061671cd515876b267"} Apr 16 20:34:20.111708 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:20.111624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" event={"ID":"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae","Type":"ContainerStarted","Data":"e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34"} Apr 16 20:34:20.112268 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:20.111842 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:21.116563 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:21.116532 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" event={"ID":"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae","Type":"ContainerStarted","Data":"79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc"} Apr 16 20:34:24.131969 ip-10-0-131-31 kubenswrapper[2577]: 
I0416 20:34:24.131918 2577 generic.go:358] "Generic (PLEG): container finished" podID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerID="b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4" exitCode=0 Apr 16 20:34:24.132331 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:24.131996 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" event={"ID":"69a059df-08aa-4f24-b614-6ee2ded6900a","Type":"ContainerDied","Data":"b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4"} Apr 16 20:34:24.962495 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:24.962461 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:34:24.990181 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:24.990149 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:34:24.990344 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:24.990321 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:24.993546 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:24.993523 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-xl9fd\"" Apr 16 20:34:24.993682 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:24.993545 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 20:34:25.115119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.115090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.115119 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.115122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.115323 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.115148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gb8t\" (UniqueName: \"kubernetes.io/projected/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kube-api-access-2gb8t\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.115323 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.115272 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.115323 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.115307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dceeca5-f8d2-4905-8cbc-49c8603e6697-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.115420 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.115355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.138841 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.138804 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" event={"ID":"69a059df-08aa-4f24-b614-6ee2ded6900a","Type":"ContainerStarted","Data":"93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d"} Apr 16 20:34:25.141408 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.141380 
2577 generic.go:358] "Generic (PLEG): container finished" podID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerID="79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc" exitCode=0 Apr 16 20:34:25.141548 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.141445 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" event={"ID":"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae","Type":"ContainerDied","Data":"79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc"} Apr 16 20:34:25.165310 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.165249 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podStartSLOduration=7.165230981 podStartE2EDuration="7.165230981s" podCreationTimestamp="2026-04-16 20:34:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:34:25.162568712 +0000 UTC m=+1364.838836944" watchObservedRunningTime="2026-04-16 20:34:25.165230981 +0000 UTC m=+1364.841499213" Apr 16 20:34:25.216068 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.216038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dceeca5-f8d2-4905-8cbc-49c8603e6697-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.216218 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.216088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" 
(UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.216218 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.216161 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.216218 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.216189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.216372 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.216225 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gb8t\" (UniqueName: \"kubernetes.io/projected/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kube-api-access-2gb8t\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.216372 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.216363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.216516 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.216491 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.216681 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.216620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.217005 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.216983 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.218785 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.218746 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dceeca5-f8d2-4905-8cbc-49c8603e6697-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.219728 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.219708 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.224171 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.224146 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gb8t\" (UniqueName: \"kubernetes.io/projected/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kube-api-access-2gb8t\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.301659 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.301623 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:34:25.446436 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:25.446405 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:34:25.448203 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:34:25.448174 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dceeca5_f8d2_4905_8cbc_49c8603e6697.slice/crio-ebc1e4d31849cc5ea276493cee207e63f87e9fed51edd502730c3f83de93b643 WatchSource:0}: Error finding container ebc1e4d31849cc5ea276493cee207e63f87e9fed51edd502730c3f83de93b643: Status 404 returned error can't find the container with id ebc1e4d31849cc5ea276493cee207e63f87e9fed51edd502730c3f83de93b643 Apr 16 20:34:26.149231 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:26.149192 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" 
event={"ID":"8dceeca5-f8d2-4905-8cbc-49c8603e6697","Type":"ContainerStarted","Data":"87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab"} Apr 16 20:34:26.149660 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:26.149340 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8dceeca5-f8d2-4905-8cbc-49c8603e6697","Type":"ContainerStarted","Data":"ebc1e4d31849cc5ea276493cee207e63f87e9fed51edd502730c3f83de93b643"} Apr 16 20:34:26.151582 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:26.151553 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" event={"ID":"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae","Type":"ContainerStarted","Data":"b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d"} Apr 16 20:34:26.185556 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:26.185492 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podStartSLOduration=7.410156708 podStartE2EDuration="8.185473042s" podCreationTimestamp="2026-04-16 20:34:18 +0000 UTC" firstStartedPulling="2026-04-16 20:34:18.989625411 +0000 UTC m=+1358.665893621" lastFinishedPulling="2026-04-16 20:34:19.764941745 +0000 UTC m=+1359.441209955" observedRunningTime="2026-04-16 20:34:26.184172078 +0000 UTC m=+1365.860440322" watchObservedRunningTime="2026-04-16 20:34:26.185473042 +0000 UTC m=+1365.861741274" Apr 16 20:34:28.845682 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:28.845633 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:28.845682 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:28.845690 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:28.847664 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:28.847634 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:34:28.868697 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:28.868651 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:28.868861 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:28.868708 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:34:28.870317 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:28.870278 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:34:30.173106 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:30.173069 2577 generic.go:358] "Generic (PLEG): container finished" podID="8dceeca5-f8d2-4905-8cbc-49c8603e6697" containerID="87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab" exitCode=0 Apr 16 20:34:30.173851 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:30.173151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" 
event={"ID":"8dceeca5-f8d2-4905-8cbc-49c8603e6697","Type":"ContainerDied","Data":"87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab"} Apr 16 20:34:31.181286 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:31.181248 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8dceeca5-f8d2-4905-8cbc-49c8603e6697","Type":"ContainerStarted","Data":"7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931"} Apr 16 20:34:31.202760 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:31.202702 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=7.202686742 podStartE2EDuration="7.202686742s" podCreationTimestamp="2026-04-16 20:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:34:31.199051876 +0000 UTC m=+1370.875320104" watchObservedRunningTime="2026-04-16 20:34:31.202686742 +0000 UTC m=+1370.878954973" Apr 16 20:34:32.787257 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.787226 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:34:32.896721 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.896683 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-dshm\") pod \"ad65e36f-272c-4e78-9d50-efafd6038367\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " Apr 16 20:34:32.896935 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.896733 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-kserve-provision-location\") pod \"ad65e36f-272c-4e78-9d50-efafd6038367\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " Apr 16 20:34:32.896935 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.896767 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-model-cache\") pod \"ad65e36f-272c-4e78-9d50-efafd6038367\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " Apr 16 20:34:32.896935 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.896839 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwtp6\" (UniqueName: \"kubernetes.io/projected/ad65e36f-272c-4e78-9d50-efafd6038367-kube-api-access-pwtp6\") pod \"ad65e36f-272c-4e78-9d50-efafd6038367\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " Apr 16 20:34:32.896935 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.896873 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad65e36f-272c-4e78-9d50-efafd6038367-tls-certs\") pod \"ad65e36f-272c-4e78-9d50-efafd6038367\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " Apr 16 20:34:32.896935 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.896918 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-home\") pod \"ad65e36f-272c-4e78-9d50-efafd6038367\" (UID: \"ad65e36f-272c-4e78-9d50-efafd6038367\") " Apr 16 20:34:32.897254 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.897065 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-model-cache" (OuterVolumeSpecName: "model-cache") pod "ad65e36f-272c-4e78-9d50-efafd6038367" (UID: "ad65e36f-272c-4e78-9d50-efafd6038367"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:32.897322 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.897263 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-model-cache\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:34:32.897412 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.897386 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-home" (OuterVolumeSpecName: "home") pod "ad65e36f-272c-4e78-9d50-efafd6038367" (UID: "ad65e36f-272c-4e78-9d50-efafd6038367"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:32.899538 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.899493 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad65e36f-272c-4e78-9d50-efafd6038367-kube-api-access-pwtp6" (OuterVolumeSpecName: "kube-api-access-pwtp6") pod "ad65e36f-272c-4e78-9d50-efafd6038367" (UID: "ad65e36f-272c-4e78-9d50-efafd6038367"). InnerVolumeSpecName "kube-api-access-pwtp6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:34:32.899737 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.899709 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-dshm" (OuterVolumeSpecName: "dshm") pod "ad65e36f-272c-4e78-9d50-efafd6038367" (UID: "ad65e36f-272c-4e78-9d50-efafd6038367"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:32.900174 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.900150 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad65e36f-272c-4e78-9d50-efafd6038367-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ad65e36f-272c-4e78-9d50-efafd6038367" (UID: "ad65e36f-272c-4e78-9d50-efafd6038367"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:34:32.990428 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.990369 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ad65e36f-272c-4e78-9d50-efafd6038367" (UID: "ad65e36f-272c-4e78-9d50-efafd6038367"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:32.998637 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.998600 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pwtp6\" (UniqueName: \"kubernetes.io/projected/ad65e36f-272c-4e78-9d50-efafd6038367-kube-api-access-pwtp6\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:34:32.998637 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.998629 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad65e36f-272c-4e78-9d50-efafd6038367-tls-certs\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:34:32.998637 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.998641 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-home\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:34:32.998905 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.998653 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-dshm\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:34:32.998905 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:32.998666 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad65e36f-272c-4e78-9d50-efafd6038367-kserve-provision-location\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:34:33.201669 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.201535 2577 generic.go:358] "Generic (PLEG): container finished" podID="ad65e36f-272c-4e78-9d50-efafd6038367" containerID="6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819" exitCode=137 Apr 16 20:34:33.201669 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.201579 2577 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" event={"ID":"ad65e36f-272c-4e78-9d50-efafd6038367","Type":"ContainerDied","Data":"6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819"} Apr 16 20:34:33.201669 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.201629 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" event={"ID":"ad65e36f-272c-4e78-9d50-efafd6038367","Type":"ContainerDied","Data":"9f22b181cb44181d01b0a5484425dd2b24689ca6f546443f783f361e2061eb02"} Apr 16 20:34:33.201669 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.201650 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7" Apr 16 20:34:33.202097 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.201651 2577 scope.go:117] "RemoveContainer" containerID="6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819" Apr 16 20:34:33.227430 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.227402 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7"] Apr 16 20:34:33.231850 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.231830 2577 scope.go:117] "RemoveContainer" containerID="1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f" Apr 16 20:34:33.234393 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.234365 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c66dbc7"] Apr 16 20:34:33.312026 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.311977 2577 scope.go:117] "RemoveContainer" containerID="6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819" Apr 16 20:34:33.312415 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:34:33.312393 2577 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819\": container with ID starting with 6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819 not found: ID does not exist" containerID="6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819" Apr 16 20:34:33.312498 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.312429 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819"} err="failed to get container status \"6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819\": rpc error: code = NotFound desc = could not find container \"6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819\": container with ID starting with 6126592be78b12b500ce81ee25212bb5bb1558f5a7325ab93b30f1234f193819 not found: ID does not exist" Apr 16 20:34:33.312498 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.312457 2577 scope.go:117] "RemoveContainer" containerID="1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f" Apr 16 20:34:33.312779 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:34:33.312760 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f\": container with ID starting with 1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f not found: ID does not exist" containerID="1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f" Apr 16 20:34:33.312855 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:33.312788 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f"} err="failed to get container status 
\"1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f\": rpc error: code = NotFound desc = could not find container \"1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f\": container with ID starting with 1e33ad773e594bccb4b64a82a0ad7ef6d853105c81a9b88084a8b49c300a400f not found: ID does not exist" Apr 16 20:34:34.877644 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:34.877604 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" path="/var/lib/kubelet/pods/ad65e36f-272c-4e78-9d50-efafd6038367/volumes" Apr 16 20:34:38.845438 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:38.845394 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:34:38.857867 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:38.857836 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:34:38.868748 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:38.868712 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:34:48.846348 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:48.846296 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": 
dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:34:48.868634 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:48.868597 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:34:58.846361 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:58.846165 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:34:58.869142 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:34:58.869089 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:35:08.845759 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:08.845711 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:35:08.868766 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:08.868722 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" 
probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:35:18.845394 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:18.845338 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:35:18.869046 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:18.868995 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:35:28.845943 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:28.845895 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:35:28.868561 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:28.868522 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:35:38.846075 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:38.846014 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" 
podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:35:38.868672 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:38.868638 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:35:48.845461 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:48.845405 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:35:48.869190 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:48.869151 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:35:58.846073 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:58.846027 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:35:58.868808 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:35:58.868769 2577 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:36:08.845719 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:08.845660 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:36:08.869172 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:08.869131 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:36:14.406479 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:14.406436 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:36:14.406984 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:14.406715 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="8dceeca5-f8d2-4905-8cbc-49c8603e6697" containerName="main" containerID="cri-o://7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931" gracePeriod=30 Apr 16 20:36:18.846288 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:18.846227 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:36:18.868401 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:18.868362 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:36:28.845789 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:28.845740 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:36:28.869104 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:28.869061 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:36:38.846075 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:38.846027 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:36:38.869037 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:38.868995 2577 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:36:40.907716 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:40.907689 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:36:40.920751 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:40.920731 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:36:45.058533 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.058510 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_8dceeca5-f8d2-4905-8cbc-49c8603e6697/main/0.log" Apr 16 20:36:45.058971 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.058931 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:36:45.136084 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.135976 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dceeca5-f8d2-4905-8cbc-49c8603e6697-tls-certs\") pod \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " Apr 16 20:36:45.136084 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.136045 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-dshm\") pod \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " Apr 16 20:36:45.136084 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.136079 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kserve-provision-location\") pod \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " Apr 16 20:36:45.136371 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.136108 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-model-cache\") pod \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " Apr 16 20:36:45.136371 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.136155 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-home\") pod \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " Apr 16 20:36:45.136371 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:36:45.136232 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gb8t\" (UniqueName: \"kubernetes.io/projected/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kube-api-access-2gb8t\") pod \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\" (UID: \"8dceeca5-f8d2-4905-8cbc-49c8603e6697\") " Apr 16 20:36:45.136789 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.136659 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-home" (OuterVolumeSpecName: "home") pod "8dceeca5-f8d2-4905-8cbc-49c8603e6697" (UID: "8dceeca5-f8d2-4905-8cbc-49c8603e6697"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:36:45.136789 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.136748 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-model-cache" (OuterVolumeSpecName: "model-cache") pod "8dceeca5-f8d2-4905-8cbc-49c8603e6697" (UID: "8dceeca5-f8d2-4905-8cbc-49c8603e6697"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:36:45.138428 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.138394 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-dshm" (OuterVolumeSpecName: "dshm") pod "8dceeca5-f8d2-4905-8cbc-49c8603e6697" (UID: "8dceeca5-f8d2-4905-8cbc-49c8603e6697"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:36:45.138521 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.138496 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dceeca5-f8d2-4905-8cbc-49c8603e6697-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8dceeca5-f8d2-4905-8cbc-49c8603e6697" (UID: "8dceeca5-f8d2-4905-8cbc-49c8603e6697"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:36:45.138740 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.138712 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kube-api-access-2gb8t" (OuterVolumeSpecName: "kube-api-access-2gb8t") pod "8dceeca5-f8d2-4905-8cbc-49c8603e6697" (UID: "8dceeca5-f8d2-4905-8cbc-49c8603e6697"). InnerVolumeSpecName "kube-api-access-2gb8t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:36:45.190221 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.190169 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8dceeca5-f8d2-4905-8cbc-49c8603e6697" (UID: "8dceeca5-f8d2-4905-8cbc-49c8603e6697"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:36:45.237567 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.237534 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dceeca5-f8d2-4905-8cbc-49c8603e6697-tls-certs\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:36:45.237567 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.237561 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-dshm\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:36:45.237567 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.237571 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kserve-provision-location\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:36:45.237769 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.237580 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-model-cache\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:36:45.237769 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.237589 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8dceeca5-f8d2-4905-8cbc-49c8603e6697-home\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:36:45.237769 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.237598 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2gb8t\" (UniqueName: \"kubernetes.io/projected/8dceeca5-f8d2-4905-8cbc-49c8603e6697-kube-api-access-2gb8t\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:36:45.789658 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.789618 2577 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_8dceeca5-f8d2-4905-8cbc-49c8603e6697/main/0.log" Apr 16 20:36:45.790052 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.790025 2577 generic.go:358] "Generic (PLEG): container finished" podID="8dceeca5-f8d2-4905-8cbc-49c8603e6697" containerID="7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931" exitCode=137 Apr 16 20:36:45.790156 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.790114 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:36:45.790156 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.790133 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8dceeca5-f8d2-4905-8cbc-49c8603e6697","Type":"ContainerDied","Data":"7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931"} Apr 16 20:36:45.790282 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.790162 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8dceeca5-f8d2-4905-8cbc-49c8603e6697","Type":"ContainerDied","Data":"ebc1e4d31849cc5ea276493cee207e63f87e9fed51edd502730c3f83de93b643"} Apr 16 20:36:45.790282 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.790180 2577 scope.go:117] "RemoveContainer" containerID="7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931" Apr 16 20:36:45.801409 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.801357 2577 scope.go:117] "RemoveContainer" containerID="87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab" Apr 16 20:36:45.817977 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.817928 2577 scope.go:117] "RemoveContainer" 
containerID="7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931" Apr 16 20:36:45.818393 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:36:45.818369 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931\": container with ID starting with 7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931 not found: ID does not exist" containerID="7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931" Apr 16 20:36:45.818510 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.818412 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931"} err="failed to get container status \"7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931\": rpc error: code = NotFound desc = could not find container \"7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931\": container with ID starting with 7417d82d38656d0980d569bb68ccf1b6fc7d0b5c2fd4966d5d9957606d9df931 not found: ID does not exist" Apr 16 20:36:45.818510 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.818437 2577 scope.go:117] "RemoveContainer" containerID="87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab" Apr 16 20:36:45.818823 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:36:45.818776 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab\": container with ID starting with 87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab not found: ID does not exist" containerID="87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab" Apr 16 20:36:45.818896 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.818817 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab"} err="failed to get container status \"87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab\": rpc error: code = NotFound desc = could not find container \"87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab\": container with ID starting with 87db2dc2c29591e70778f6f91e90958e1a98449cb34d8bed141286743116e1ab not found: ID does not exist" Apr 16 20:36:45.819870 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.819837 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:36:45.823555 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:45.823532 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:36:46.878046 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:46.878011 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dceeca5-f8d2-4905-8cbc-49c8603e6697" path="/var/lib/kubelet/pods/8dceeca5-f8d2-4905-8cbc-49c8603e6697/volumes" Apr 16 20:36:48.845780 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:48.845727 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:36:48.869116 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:48.869076 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 
16 20:36:58.845301 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:58.845255 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8001/health\": dial tcp 10.133.0.52:8001: connect: connection refused" Apr 16 20:36:58.868815 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:36:58.868778 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 16 20:37:08.854972 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:08.854879 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:37:08.867516 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:08.867490 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:37:08.878424 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:08.878405 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:37:08.888343 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:08.888318 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:37:20.855357 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:20.855321 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg"] Apr 16 20:37:20.855843 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:20.855722 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" containerID="cri-o://b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d" gracePeriod=30 Apr 16 20:37:20.860026 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:20.860002 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs"] Apr 16 20:37:20.860261 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:20.860236 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" containerID="cri-o://93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d" gracePeriod=30 Apr 16 20:37:50.856320 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:50.856257 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="llm-d-routing-sidecar" containerID="cri-o://e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34" gracePeriod=2 Apr 16 20:37:51.044724 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.044697 2577 generic.go:358] "Generic (PLEG): container finished" podID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerID="e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34" exitCode=0 Apr 16 20:37:51.044835 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.044740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" 
event={"ID":"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae","Type":"ContainerDied","Data":"e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34"} Apr 16 20:37:51.315359 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.315335 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:37:51.318369 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.318346 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg_e6705e2e-53fe-4fc1-b5ab-082ac0dcacae/main/0.log" Apr 16 20:37:51.318973 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.318932 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:37:51.340021 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340002 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-dshm\") pod \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " Apr 16 20:37:51.340126 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340042 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-tls-certs\") pod \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " Apr 16 20:37:51.340126 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340059 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-home\") pod \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " Apr 16 20:37:51.340126 
ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340079 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkfn9\" (UniqueName: \"kubernetes.io/projected/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kube-api-access-mkfn9\") pod \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " Apr 16 20:37:51.340126 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340110 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-dshm\") pod \"69a059df-08aa-4f24-b614-6ee2ded6900a\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " Apr 16 20:37:51.340339 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340129 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-kserve-provision-location\") pod \"69a059df-08aa-4f24-b614-6ee2ded6900a\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " Apr 16 20:37:51.340339 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340145 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-model-cache\") pod \"69a059df-08aa-4f24-b614-6ee2ded6900a\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " Apr 16 20:37:51.340339 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340192 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kserve-provision-location\") pod \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " Apr 16 20:37:51.340339 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340243 2577 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-model-cache\") pod \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\" (UID: \"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae\") " Apr 16 20:37:51.340339 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340272 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwtjq\" (UniqueName: \"kubernetes.io/projected/69a059df-08aa-4f24-b614-6ee2ded6900a-kube-api-access-qwtjq\") pod \"69a059df-08aa-4f24-b614-6ee2ded6900a\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " Apr 16 20:37:51.340339 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340303 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/69a059df-08aa-4f24-b614-6ee2ded6900a-tls-certs\") pod \"69a059df-08aa-4f24-b614-6ee2ded6900a\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " Apr 16 20:37:51.340634 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340352 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-home\") pod \"69a059df-08aa-4f24-b614-6ee2ded6900a\" (UID: \"69a059df-08aa-4f24-b614-6ee2ded6900a\") " Apr 16 20:37:51.340634 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340429 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-home" (OuterVolumeSpecName: "home") pod "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" (UID: "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:37:51.340744 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.340691 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-home\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:51.341377 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.341233 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-model-cache" (OuterVolumeSpecName: "model-cache") pod "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" (UID: "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:37:51.341850 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.341562 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-home" (OuterVolumeSpecName: "home") pod "69a059df-08aa-4f24-b614-6ee2ded6900a" (UID: "69a059df-08aa-4f24-b614-6ee2ded6900a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:37:51.342046 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.342012 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-model-cache" (OuterVolumeSpecName: "model-cache") pod "69a059df-08aa-4f24-b614-6ee2ded6900a" (UID: "69a059df-08aa-4f24-b614-6ee2ded6900a"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:37:51.342860 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.342833 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-dshm" (OuterVolumeSpecName: "dshm") pod "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" (UID: "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:37:51.342981 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.342904 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" (UID: "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:37:51.343234 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.343206 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a059df-08aa-4f24-b614-6ee2ded6900a-kube-api-access-qwtjq" (OuterVolumeSpecName: "kube-api-access-qwtjq") pod "69a059df-08aa-4f24-b614-6ee2ded6900a" (UID: "69a059df-08aa-4f24-b614-6ee2ded6900a"). InnerVolumeSpecName "kube-api-access-qwtjq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:37:51.343425 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.343392 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-dshm" (OuterVolumeSpecName: "dshm") pod "69a059df-08aa-4f24-b614-6ee2ded6900a" (UID: "69a059df-08aa-4f24-b614-6ee2ded6900a"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:37:51.343499 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.343448 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kube-api-access-mkfn9" (OuterVolumeSpecName: "kube-api-access-mkfn9") pod "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" (UID: "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae"). InnerVolumeSpecName "kube-api-access-mkfn9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:37:51.344450 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.344430 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a059df-08aa-4f24-b614-6ee2ded6900a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "69a059df-08aa-4f24-b614-6ee2ded6900a" (UID: "69a059df-08aa-4f24-b614-6ee2ded6900a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:37:51.366761 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.366725 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "69a059df-08aa-4f24-b614-6ee2ded6900a" (UID: "69a059df-08aa-4f24-b614-6ee2ded6900a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:37:51.370422 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.370393 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" (UID: "e6705e2e-53fe-4fc1-b5ab-082ac0dcacae"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:37:51.441111 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.441046 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-dshm\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:51.441111 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.441069 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-tls-certs\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:51.441111 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.441080 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mkfn9\" (UniqueName: \"kubernetes.io/projected/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kube-api-access-mkfn9\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:51.441111 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.441091 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-dshm\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:51.441111 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.441099 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-kserve-provision-location\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:51.441111 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.441108 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-model-cache\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:51.441380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.441119 2577 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-kserve-provision-location\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:51.441380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.441128 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae-model-cache\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:51.441380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.441137 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwtjq\" (UniqueName: \"kubernetes.io/projected/69a059df-08aa-4f24-b614-6ee2ded6900a-kube-api-access-qwtjq\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:51.441380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.441144 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/69a059df-08aa-4f24-b614-6ee2ded6900a-tls-certs\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:51.441380 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:51.441152 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/69a059df-08aa-4f24-b614-6ee2ded6900a-home\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:37:52.049924 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.049889 2577 generic.go:358] "Generic (PLEG): container finished" podID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerID="93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d" exitCode=137 Apr 16 20:37:52.050388 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.049973 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" 
event={"ID":"69a059df-08aa-4f24-b614-6ee2ded6900a","Type":"ContainerDied","Data":"93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d"} Apr 16 20:37:52.050388 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.049999 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" Apr 16 20:37:52.050388 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.050015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs" event={"ID":"69a059df-08aa-4f24-b614-6ee2ded6900a","Type":"ContainerDied","Data":"9f54dada4aea43681c9a0c25f2c55cd10a5825f88d6fd122ed63f2f3efebbef6"} Apr 16 20:37:52.050388 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.050037 2577 scope.go:117] "RemoveContainer" containerID="93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d" Apr 16 20:37:52.051503 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.051483 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg_e6705e2e-53fe-4fc1-b5ab-082ac0dcacae/main/0.log" Apr 16 20:37:52.052145 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.052123 2577 generic.go:358] "Generic (PLEG): container finished" podID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerID="b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d" exitCode=137 Apr 16 20:37:52.052254 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.052207 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" event={"ID":"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae","Type":"ContainerDied","Data":"b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d"} Apr 16 20:37:52.052254 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.052239 2577 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" Apr 16 20:37:52.052254 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.052246 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg" event={"ID":"e6705e2e-53fe-4fc1-b5ab-082ac0dcacae","Type":"ContainerDied","Data":"39f77d35911020737ea7fbe55d1fbade91b62bb951b9a9061671cd515876b267"} Apr 16 20:37:52.072918 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.072899 2577 scope.go:117] "RemoveContainer" containerID="b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4" Apr 16 20:37:52.077556 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.077533 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg"] Apr 16 20:37:52.081125 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.081100 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c7bf48db5-9jdhg"] Apr 16 20:37:52.085322 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.085298 2577 scope.go:117] "RemoveContainer" containerID="93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d" Apr 16 20:37:52.085611 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:37:52.085590 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d\": container with ID starting with 93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d not found: ID does not exist" containerID="93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d" Apr 16 20:37:52.085666 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.085619 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d"} err="failed to get container status \"93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d\": rpc error: code = NotFound desc = could not find container \"93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d\": container with ID starting with 93f6049b476617f7ca4f68bb44eccc6dde5e9141e8f55c99efc053095734765d not found: ID does not exist" Apr 16 20:37:52.085666 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.085641 2577 scope.go:117] "RemoveContainer" containerID="b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4" Apr 16 20:37:52.085936 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:37:52.085907 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4\": container with ID starting with b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4 not found: ID does not exist" containerID="b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4" Apr 16 20:37:52.086045 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.085937 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4"} err="failed to get container status \"b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4\": rpc error: code = NotFound desc = could not find container \"b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4\": container with ID starting with b96abc14fc88e2b696c8076698512218441cbd6f970a5872cbad0538376ccff4 not found: ID does not exist" Apr 16 20:37:52.086045 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.085981 2577 scope.go:117] "RemoveContainer" containerID="b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d" Apr 16 20:37:52.090909 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:37:52.090830 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs"] Apr 16 20:37:52.093833 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.093811 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f67d97b6b-jgsqs"] Apr 16 20:37:52.096127 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.096111 2577 scope.go:117] "RemoveContainer" containerID="79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc" Apr 16 20:37:52.108216 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.108194 2577 scope.go:117] "RemoveContainer" containerID="e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34" Apr 16 20:37:52.117411 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.117389 2577 scope.go:117] "RemoveContainer" containerID="b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d" Apr 16 20:37:52.117651 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:37:52.117632 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d\": container with ID starting with b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d not found: ID does not exist" containerID="b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d" Apr 16 20:37:52.117708 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.117664 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d"} err="failed to get container status \"b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d\": rpc error: code = NotFound desc = could not find container \"b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d\": container with ID starting with 
b8d7b4988e13313e8085872ed4926a501725464764944e168ee81b84faa9457d not found: ID does not exist" Apr 16 20:37:52.117708 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.117684 2577 scope.go:117] "RemoveContainer" containerID="79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc" Apr 16 20:37:52.117918 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:37:52.117897 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc\": container with ID starting with 79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc not found: ID does not exist" containerID="79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc" Apr 16 20:37:52.118014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.117926 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc"} err="failed to get container status \"79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc\": rpc error: code = NotFound desc = could not find container \"79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc\": container with ID starting with 79a16d285fdc867c05059a69ab394d0f9c8feb4cfc1e69c7041c5545d0f0b7fc not found: ID does not exist" Apr 16 20:37:52.118014 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.117967 2577 scope.go:117] "RemoveContainer" containerID="e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34" Apr 16 20:37:52.118245 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:37:52.118228 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34\": container with ID starting with e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34 not found: ID does not exist" 
containerID="e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34" Apr 16 20:37:52.118290 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.118249 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34"} err="failed to get container status \"e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34\": rpc error: code = NotFound desc = could not find container \"e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34\": container with ID starting with e0dfc49f864f31562f0cea0d7b98232a4a21a5167238e21e2c175e4c0dae7e34 not found: ID does not exist" Apr 16 20:37:52.876162 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.876126 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" path="/var/lib/kubelet/pods/69a059df-08aa-4f24-b614-6ee2ded6900a/volumes" Apr 16 20:37:52.876634 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:37:52.876618 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" path="/var/lib/kubelet/pods/e6705e2e-53fe-4fc1-b5ab-082ac0dcacae/volumes" Apr 16 20:41:40.939128 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:41:40.939102 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:41:40.952911 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:41:40.952887 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:44:19.426736 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.426693 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cmcsx/must-gather-5lrtr"] Apr 16 20:44:19.428542 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:44:19.427283 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="storage-initializer" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427301 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="storage-initializer" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427318 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="storage-initializer" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427323 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="storage-initializer" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427332 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427338 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427346 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427351 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427364 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" Apr 16 20:44:19.428542 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:44:19.427369 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427376 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8dceeca5-f8d2-4905-8cbc-49c8603e6697" containerName="main" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427381 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dceeca5-f8d2-4905-8cbc-49c8603e6697" containerName="main" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427390 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8dceeca5-f8d2-4905-8cbc-49c8603e6697" containerName="storage-initializer" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427395 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dceeca5-f8d2-4905-8cbc-49c8603e6697" containerName="storage-initializer" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427403 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="storage-initializer" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427408 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="storage-initializer" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427415 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="llm-d-routing-sidecar" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427420 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="llm-d-routing-sidecar" Apr 16 20:44:19.428542 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:44:19.427487 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="llm-d-routing-sidecar" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427495 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="69a059df-08aa-4f24-b614-6ee2ded6900a" containerName="main" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427502 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8dceeca5-f8d2-4905-8cbc-49c8603e6697" containerName="main" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427512 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6705e2e-53fe-4fc1-b5ab-082ac0dcacae" containerName="main" Apr 16 20:44:19.428542 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.427518 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad65e36f-272c-4e78-9d50-efafd6038367" containerName="main" Apr 16 20:44:19.429885 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.429869 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" Apr 16 20:44:19.432480 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.432458 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cmcsx\"/\"openshift-service-ca.crt\"" Apr 16 20:44:19.432624 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.432515 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cmcsx\"/\"kube-root-ca.crt\"" Apr 16 20:44:19.433230 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.433213 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-cmcsx\"/\"default-dockercfg-kz82c\"" Apr 16 20:44:19.437305 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.437288 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cmcsx/must-gather-5lrtr"] Apr 16 20:44:19.505050 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.505012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-must-gather-output\") pod \"must-gather-5lrtr\" (UID: \"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4\") " pod="openshift-must-gather-cmcsx/must-gather-5lrtr" Apr 16 20:44:19.505050 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.505052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpx77\" (UniqueName: \"kubernetes.io/projected/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-kube-api-access-kpx77\") pod \"must-gather-5lrtr\" (UID: \"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4\") " pod="openshift-must-gather-cmcsx/must-gather-5lrtr" Apr 16 20:44:19.606021 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.605982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-must-gather-output\") pod \"must-gather-5lrtr\" (UID: \"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4\") " pod="openshift-must-gather-cmcsx/must-gather-5lrtr" Apr 16 20:44:19.606186 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.606035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpx77\" (UniqueName: \"kubernetes.io/projected/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-kube-api-access-kpx77\") pod \"must-gather-5lrtr\" (UID: \"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4\") " pod="openshift-must-gather-cmcsx/must-gather-5lrtr" Apr 16 20:44:19.606306 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.606289 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-must-gather-output\") pod \"must-gather-5lrtr\" (UID: \"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4\") " pod="openshift-must-gather-cmcsx/must-gather-5lrtr" Apr 16 20:44:19.614092 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.614070 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpx77\" (UniqueName: \"kubernetes.io/projected/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-kube-api-access-kpx77\") pod \"must-gather-5lrtr\" (UID: \"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4\") " pod="openshift-must-gather-cmcsx/must-gather-5lrtr" Apr 16 20:44:19.740120 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.740038 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" Apr 16 20:44:19.857685 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.857659 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cmcsx/must-gather-5lrtr"] Apr 16 20:44:19.859226 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:44:19.859196 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb90ffcc5_e18f_42fb_bd1f_0ada1e47fdf4.slice/crio-aa5ed2a6e666ed907f0524314d40ce43ecdee478d4d65ecd9cf9212d0b38fa93 WatchSource:0}: Error finding container aa5ed2a6e666ed907f0524314d40ce43ecdee478d4d65ecd9cf9212d0b38fa93: Status 404 returned error can't find the container with id aa5ed2a6e666ed907f0524314d40ce43ecdee478d4d65ecd9cf9212d0b38fa93 Apr 16 20:44:19.860854 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:19.860834 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:44:20.521610 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:20.521579 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" event={"ID":"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4","Type":"ContainerStarted","Data":"aa5ed2a6e666ed907f0524314d40ce43ecdee478d4d65ecd9cf9212d0b38fa93"} Apr 16 20:44:24.542377 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:24.542338 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" event={"ID":"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4","Type":"ContainerStarted","Data":"efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0"} Apr 16 20:44:24.542751 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:24.542385 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" 
event={"ID":"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4","Type":"ContainerStarted","Data":"86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8"} Apr 16 20:44:24.564807 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:24.564739 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" podStartSLOduration=1.186222933 podStartE2EDuration="5.564718948s" podCreationTimestamp="2026-04-16 20:44:19 +0000 UTC" firstStartedPulling="2026-04-16 20:44:19.860975403 +0000 UTC m=+1959.537243615" lastFinishedPulling="2026-04-16 20:44:24.239471411 +0000 UTC m=+1963.915739630" observedRunningTime="2026-04-16 20:44:24.561171874 +0000 UTC m=+1964.237440105" watchObservedRunningTime="2026-04-16 20:44:24.564718948 +0000 UTC m=+1964.240987181" Apr 16 20:44:48.053861 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:48.053830 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-78d5944d84-snw2w_09c2bd17-9feb-4960-ba71-c41b908e03e4/router/0.log" Apr 16 20:44:48.895298 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:48.895272 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-78d5944d84-snw2w_09c2bd17-9feb-4960-ba71-c41b908e03e4/router/0.log" Apr 16 20:44:49.670889 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:49.670854 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-xrzmd_aad07ab7-8103-4561-997c-1467c2b60d4a/authorino/0.log" Apr 16 20:44:49.699521 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:49.699494 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-wzs24_bdaf6977-5831-4576-9c76-41b5f95830db/manager/0.log" Apr 16 20:44:49.712357 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:49.712333 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-rlglg_6570b1e5-6d2f-4da4-a328-7d2ae8157394/kuadrant-console-plugin/0.log" Apr 16 20:44:49.788791 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:49.788765 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-h7fjx_1908133b-0394-499d-ae09-9aaf7ea5ddfb/limitador/0.log" Apr 16 20:44:50.650628 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:50.650588 2577 generic.go:358] "Generic (PLEG): container finished" podID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" containerID="86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8" exitCode=0 Apr 16 20:44:50.650867 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:50.650665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" event={"ID":"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4","Type":"ContainerDied","Data":"86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8"} Apr 16 20:44:50.651079 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:50.651062 2577 scope.go:117] "RemoveContainer" containerID="86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8" Apr 16 20:44:51.445929 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:51.445900 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cmcsx_must-gather-5lrtr_b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4/gather/0.log" Apr 16 20:44:55.165814 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:55.165785 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8ng5b_e23b04fb-ea11-4ca2-9b81-1b9b8594f1eb/global-pull-secret-syncer/0.log" Apr 16 20:44:55.288603 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:55.288572 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9qvtx_99d4b836-b69f-4839-b4e1-937780e53711/konnectivity-agent/0.log" Apr 16 20:44:55.347830 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:44:55.347797 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-31.ec2.internal_83cea7cfd9a6bfdb650e2cb8e7b4bc3c/haproxy/0.log" Apr 16 20:44:56.938587 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:56.938539 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cmcsx/must-gather-5lrtr"] Apr 16 20:44:56.939072 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:56.938845 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" containerName="copy" containerID="cri-o://efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0" gracePeriod=2 Apr 16 20:44:56.941401 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:56.941366 2577 status_manager.go:895] "Failed to get status for pod" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" err="pods \"must-gather-5lrtr\" is forbidden: User \"system:node:ip-10-0-131-31.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-cmcsx\": no relationship found between node 'ip-10-0-131-31.ec2.internal' and this object" Apr 16 20:44:56.942324 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:56.942304 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cmcsx/must-gather-5lrtr"] Apr 16 20:44:57.176668 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.176648 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cmcsx_must-gather-5lrtr_b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4/copy/0.log" Apr 16 20:44:57.177050 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.177028 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" Apr 16 20:44:57.178854 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.178831 2577 status_manager.go:895] "Failed to get status for pod" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" err="pods \"must-gather-5lrtr\" is forbidden: User \"system:node:ip-10-0-131-31.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-cmcsx\": no relationship found between node 'ip-10-0-131-31.ec2.internal' and this object" Apr 16 20:44:57.362492 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.362453 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-must-gather-output\") pod \"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4\" (UID: \"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4\") " Apr 16 20:44:57.362647 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.362547 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpx77\" (UniqueName: \"kubernetes.io/projected/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-kube-api-access-kpx77\") pod \"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4\" (UID: \"b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4\") " Apr 16 20:44:57.364649 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.364615 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-kube-api-access-kpx77" (OuterVolumeSpecName: "kube-api-access-kpx77") pod "b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" (UID: "b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4"). InnerVolumeSpecName "kube-api-access-kpx77". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:44:57.368283 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.368261 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" (UID: "b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:57.463739 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.463701 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kpx77\" (UniqueName: \"kubernetes.io/projected/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-kube-api-access-kpx77\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:44:57.463739 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.463737 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4-must-gather-output\") on node \"ip-10-0-131-31.ec2.internal\" DevicePath \"\"" Apr 16 20:44:57.680975 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.680884 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cmcsx_must-gather-5lrtr_b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4/copy/0.log" Apr 16 20:44:57.681258 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.681234 2577 generic.go:358] "Generic (PLEG): container finished" podID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" containerID="efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0" exitCode=143 Apr 16 20:44:57.681366 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.681299 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" Apr 16 20:44:57.681366 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.681334 2577 scope.go:117] "RemoveContainer" containerID="efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0" Apr 16 20:44:57.683530 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.683504 2577 status_manager.go:895] "Failed to get status for pod" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" err="pods \"must-gather-5lrtr\" is forbidden: User \"system:node:ip-10-0-131-31.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-cmcsx\": no relationship found between node 'ip-10-0-131-31.ec2.internal' and this object" Apr 16 20:44:57.690218 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.690146 2577 scope.go:117] "RemoveContainer" containerID="86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8" Apr 16 20:44:57.692287 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.692267 2577 status_manager.go:895] "Failed to get status for pod" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" pod="openshift-must-gather-cmcsx/must-gather-5lrtr" err="pods \"must-gather-5lrtr\" is forbidden: User \"system:node:ip-10-0-131-31.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-cmcsx\": no relationship found between node 'ip-10-0-131-31.ec2.internal' and this object" Apr 16 20:44:57.703577 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.703559 2577 scope.go:117] "RemoveContainer" containerID="efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0" Apr 16 20:44:57.703817 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:44:57.703797 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0\": container with ID starting 
with efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0 not found: ID does not exist" containerID="efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0" Apr 16 20:44:57.703877 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.703824 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0"} err="failed to get container status \"efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0\": rpc error: code = NotFound desc = could not find container \"efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0\": container with ID starting with efecb3f346b67b2d9128c64a99599e696e952d08b04cc087d8de231646e2a7a0 not found: ID does not exist" Apr 16 20:44:57.703877 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.703842 2577 scope.go:117] "RemoveContainer" containerID="86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8" Apr 16 20:44:57.704063 ip-10-0-131-31 kubenswrapper[2577]: E0416 20:44:57.704047 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8\": container with ID starting with 86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8 not found: ID does not exist" containerID="86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8" Apr 16 20:44:57.704116 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:57.704067 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8"} err="failed to get container status \"86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8\": rpc error: code = NotFound desc = could not find container \"86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8\": container with ID starting with 
86a67c474246b69295e8023ffbf5ac26aed96d8a6e4dcb8feaad647d2d3bc8b8 not found: ID does not exist" Apr 16 20:44:58.881345 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:58.881302 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" path="/var/lib/kubelet/pods/b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4/volumes" Apr 16 20:44:59.082006 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:59.081980 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-xrzmd_aad07ab7-8103-4561-997c-1467c2b60d4a/authorino/0.log" Apr 16 20:44:59.136331 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:59.136260 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-wzs24_bdaf6977-5831-4576-9c76-41b5f95830db/manager/0.log" Apr 16 20:44:59.200841 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:59.200819 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-rlglg_6570b1e5-6d2f-4da4-a328-7d2ae8157394/kuadrant-console-plugin/0.log" Apr 16 20:44:59.360406 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:44:59.360381 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-h7fjx_1908133b-0394-499d-ae09-9aaf7ea5ddfb/limitador/0.log" Apr 16 20:45:00.758065 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:00.758032 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-x28rw_260cd2e6-6dc5-4bf2-bbc0-56935883c1da/kube-state-metrics/0.log" Apr 16 20:45:00.779106 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:00.779076 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-x28rw_260cd2e6-6dc5-4bf2-bbc0-56935883c1da/kube-rbac-proxy-main/0.log" Apr 16 20:45:00.800648 ip-10-0-131-31 kubenswrapper[2577]: I0416 
20:45:00.800627 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-x28rw_260cd2e6-6dc5-4bf2-bbc0-56935883c1da/kube-rbac-proxy-self/0.log" Apr 16 20:45:00.860129 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:00.860105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-64wr5_d427af70-58f2-4318-aa05-1933c3bcd25e/monitoring-plugin/0.log" Apr 16 20:45:00.962520 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:00.962488 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qmmmv_eb507573-fa4f-4085-a62c-3ffc4585f0c8/node-exporter/0.log" Apr 16 20:45:00.983746 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:00.983720 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qmmmv_eb507573-fa4f-4085-a62c-3ffc4585f0c8/kube-rbac-proxy/0.log" Apr 16 20:45:01.007172 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.007144 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qmmmv_eb507573-fa4f-4085-a62c-3ffc4585f0c8/init-textfile/0.log" Apr 16 20:45:01.125154 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.125125 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2dc5g_6ce0c9f4-cc74-47c2-8391-1095355362df/kube-rbac-proxy-main/0.log" Apr 16 20:45:01.151750 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.151719 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2dc5g_6ce0c9f4-cc74-47c2-8391-1095355362df/kube-rbac-proxy-self/0.log" Apr 16 20:45:01.172540 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.172521 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2dc5g_6ce0c9f4-cc74-47c2-8391-1095355362df/openshift-state-metrics/0.log" Apr 16 20:45:01.490373 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.490301 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75f7f6db87-tnr4d_2db51ca0-6cdc-4f9c-b747-6a7f612dba5e/telemeter-client/0.log" Apr 16 20:45:01.516998 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.516971 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75f7f6db87-tnr4d_2db51ca0-6cdc-4f9c-b747-6a7f612dba5e/reload/0.log" Apr 16 20:45:01.541893 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.541873 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75f7f6db87-tnr4d_2db51ca0-6cdc-4f9c-b747-6a7f612dba5e/kube-rbac-proxy/0.log" Apr 16 20:45:01.601908 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.601882 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/thanos-query/0.log" Apr 16 20:45:01.629104 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.629082 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/kube-rbac-proxy-web/0.log" Apr 16 20:45:01.651262 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.651239 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/kube-rbac-proxy/0.log" Apr 16 20:45:01.675662 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.675639 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/prom-label-proxy/0.log" Apr 16 20:45:01.698884 ip-10-0-131-31 
kubenswrapper[2577]: I0416 20:45:01.698861 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/kube-rbac-proxy-rules/0.log" Apr 16 20:45:01.720827 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:01.720800 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6575947857-tnl4r_9e9c1e01-825d-4a05-9b6a-a90f26ea32a0/kube-rbac-proxy-metrics/0.log" Apr 16 20:45:03.439604 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.439578 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/2.log" Apr 16 20:45:03.444077 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.444059 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gbsjj_066e6004-882a-4b2c-a626-ecac83f8b502/console-operator/3.log" Apr 16 20:45:03.882162 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.882126 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p"] Apr 16 20:45:03.882619 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.882602 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" containerName="copy" Apr 16 20:45:03.882696 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.882622 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" containerName="copy" Apr 16 20:45:03.882696 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.882644 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" containerName="gather" Apr 16 20:45:03.882696 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.882649 2577 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" containerName="gather" Apr 16 20:45:03.882814 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.882711 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" containerName="gather" Apr 16 20:45:03.882814 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.882721 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b90ffcc5-e18f-42fb-bd1f-0ada1e47fdf4" containerName="copy" Apr 16 20:45:03.889244 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.889227 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:03.891744 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.891710 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d58bg\"/\"default-dockercfg-fs8m8\"" Apr 16 20:45:03.892621 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.892603 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d58bg\"/\"kube-root-ca.crt\"" Apr 16 20:45:03.892914 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.892889 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d58bg\"/\"openshift-service-ca.crt\"" Apr 16 20:45:03.894741 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.894725 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b77b95f98-xqbnk_9511771d-426a-40a5-9e10-90e1f3c18744/console/0.log" Apr 16 20:45:03.895568 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.895547 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p"] Apr 16 20:45:03.914330 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.914309 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-sys\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:03.914431 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.914360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-lib-modules\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:03.914485 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.914430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-proc\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:03.914527 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.914481 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-podres\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:03.914527 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.914508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fr57\" (UniqueName: \"kubernetes.io/projected/e8a3259e-968b-4600-a60e-269eac5c216a-kube-api-access-6fr57\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " 
pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:03.926421 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:03.926400 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-lzfpw_3362abe5-749c-485c-998e-fa0949747cac/download-server/0.log" Apr 16 20:45:04.014914 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.014873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fr57\" (UniqueName: \"kubernetes.io/projected/e8a3259e-968b-4600-a60e-269eac5c216a-kube-api-access-6fr57\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.014914 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.014925 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-sys\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.015130 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.015017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-lib-modules\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.015130 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.015063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-proc\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " 
pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.015130 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.015017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-sys\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.015130 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.015099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-podres\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.015130 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.015102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-lib-modules\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.015320 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.015144 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-proc\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.015320 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.015214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e8a3259e-968b-4600-a60e-269eac5c216a-podres\") pod 
\"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.022858 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.022829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fr57\" (UniqueName: \"kubernetes.io/projected/e8a3259e-968b-4600-a60e-269eac5c216a-kube-api-access-6fr57\") pod \"perf-node-gather-daemonset-xb85p\" (UID: \"e8a3259e-968b-4600-a60e-269eac5c216a\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.200388 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.200300 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.376554 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.376527 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-767l6_cb38b8a2-445d-466a-84c3-eac1bca4f425/volume-data-source-validator/0.log" Apr 16 20:45:04.527976 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.527931 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p"] Apr 16 20:45:04.529856 ip-10-0-131-31 kubenswrapper[2577]: W0416 20:45:04.529829 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode8a3259e_968b_4600_a60e_269eac5c216a.slice/crio-a54fc10d9ac2f3695ab1c31a20565c23fafad87511028a620a2778359bef9d9a WatchSource:0}: Error finding container a54fc10d9ac2f3695ab1c31a20565c23fafad87511028a620a2778359bef9d9a: Status 404 returned error can't find the container with id a54fc10d9ac2f3695ab1c31a20565c23fafad87511028a620a2778359bef9d9a Apr 16 20:45:04.709110 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.709077 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" event={"ID":"e8a3259e-968b-4600-a60e-269eac5c216a","Type":"ContainerStarted","Data":"2fb8f6d97f9889ca13a434e1a99b144265b01df022963b38014749ed925ea3bf"} Apr 16 20:45:04.709234 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.709115 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" event={"ID":"e8a3259e-968b-4600-a60e-269eac5c216a","Type":"ContainerStarted","Data":"a54fc10d9ac2f3695ab1c31a20565c23fafad87511028a620a2778359bef9d9a"} Apr 16 20:45:04.709234 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.709204 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:04.726623 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:04.726583 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" podStartSLOduration=1.726569768 podStartE2EDuration="1.726569768s" podCreationTimestamp="2026-04-16 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:45:04.725544022 +0000 UTC m=+2004.401812254" watchObservedRunningTime="2026-04-16 20:45:04.726569768 +0000 UTC m=+2004.402838033" Apr 16 20:45:05.079322 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:05.079296 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lbx4w_07dfb99b-8784-4865-98a4-3e29b5215bfd/dns/0.log" Apr 16 20:45:05.102270 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:05.102248 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lbx4w_07dfb99b-8784-4865-98a4-3e29b5215bfd/kube-rbac-proxy/0.log" Apr 16 20:45:05.268680 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:05.268652 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-w8lw7_9c16c53d-2e96-42ef-86b8-344e57a267ee/dns-node-resolver/0.log" Apr 16 20:45:05.777249 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:05.777222 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jwfkw_56120f2f-13d6-48da-a3d5-a472edb09dfc/node-ca/0.log" Apr 16 20:45:06.655758 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:06.655730 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-78d5944d84-snw2w_09c2bd17-9feb-4960-ba71-c41b908e03e4/router/0.log" Apr 16 20:45:07.099541 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:07.099519 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-nv4vl_9865442e-a577-4b94-be71-e7cbe1fa9199/serve-healthcheck-canary/0.log" Apr 16 20:45:07.576676 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:07.576593 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-z9g4t_f6c48392-b5b1-4068-b4a8-8931576660ed/insights-operator/0.log" Apr 16 20:45:07.577450 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:07.577429 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-z9g4t_f6c48392-b5b1-4068-b4a8-8931576660ed/insights-operator/1.log" Apr 16 20:45:07.737887 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:07.737862 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v5l7c_215df8f5-e00a-4e0a-ac8f-587008d6c91b/kube-rbac-proxy/0.log" Apr 16 20:45:07.759486 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:07.759451 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v5l7c_215df8f5-e00a-4e0a-ac8f-587008d6c91b/exporter/0.log" Apr 16 20:45:07.780722 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:07.780701 2577 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v5l7c_215df8f5-e00a-4e0a-ac8f-587008d6c91b/extractor/0.log" Apr 16 20:45:10.280982 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:10.280937 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-986f4df7-frzwd_ce9b2f6b-9ec4-4a0d-8778-697c8af4b720/manager/0.log" Apr 16 20:45:10.722675 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:10.722650 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-xb85p" Apr 16 20:45:15.898173 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:15.898147 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-t6pq4_2979838d-4d67-4cb3-9880-9d8d330a873f/migrator/0.log" Apr 16 20:45:15.936001 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:15.935971 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-t6pq4_2979838d-4d67-4cb3-9880-9d8d330a873f/graceful-termination/0.log" Apr 16 20:45:17.279023 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:17.278997 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8gsfm_e372e398-739d-495f-a53d-0b5eea55bc76/kube-multus/0.log" Apr 16 20:45:17.307929 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:17.307905 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gf96_bd1cbed9-748e-4677-b9ad-5f73500b5738/kube-multus-additional-cni-plugins/0.log" Apr 16 20:45:17.330171 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:17.330152 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gf96_bd1cbed9-748e-4677-b9ad-5f73500b5738/egress-router-binary-copy/0.log" Apr 16 20:45:17.352597 ip-10-0-131-31 kubenswrapper[2577]: 
I0416 20:45:17.352573 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gf96_bd1cbed9-748e-4677-b9ad-5f73500b5738/cni-plugins/0.log" Apr 16 20:45:17.374299 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:17.374280 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gf96_bd1cbed9-748e-4677-b9ad-5f73500b5738/bond-cni-plugin/0.log" Apr 16 20:45:17.401222 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:17.401202 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gf96_bd1cbed9-748e-4677-b9ad-5f73500b5738/routeoverride-cni/0.log" Apr 16 20:45:17.421927 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:17.421906 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gf96_bd1cbed9-748e-4677-b9ad-5f73500b5738/whereabouts-cni-bincopy/0.log" Apr 16 20:45:17.443842 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:17.443814 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gf96_bd1cbed9-748e-4677-b9ad-5f73500b5738/whereabouts-cni/0.log" Apr 16 20:45:17.896850 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:17.896818 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l9pv6_c7481f18-a2dc-4df2-8b59-4184bc5c7283/network-metrics-daemon/0.log" Apr 16 20:45:17.915982 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:17.915939 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l9pv6_c7481f18-a2dc-4df2-8b59-4184bc5c7283/kube-rbac-proxy/0.log" Apr 16 20:45:19.029938 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:19.029912 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-985k4_909cb17e-abc8-410f-adb0-206e00624944/ovn-controller/0.log" Apr 
16 20:45:19.060620 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:19.060590 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-985k4_909cb17e-abc8-410f-adb0-206e00624944/ovn-acl-logging/0.log" Apr 16 20:45:19.080396 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:19.080372 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-985k4_909cb17e-abc8-410f-adb0-206e00624944/kube-rbac-proxy-node/0.log" Apr 16 20:45:19.100965 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:19.100928 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-985k4_909cb17e-abc8-410f-adb0-206e00624944/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 20:45:19.119267 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:19.119249 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-985k4_909cb17e-abc8-410f-adb0-206e00624944/northd/0.log" Apr 16 20:45:19.141849 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:19.141829 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-985k4_909cb17e-abc8-410f-adb0-206e00624944/nbdb/0.log" Apr 16 20:45:19.167520 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:19.167501 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-985k4_909cb17e-abc8-410f-adb0-206e00624944/sbdb/0.log" Apr 16 20:45:19.283282 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:19.283210 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-985k4_909cb17e-abc8-410f-adb0-206e00624944/ovnkube-controller/0.log" Apr 16 20:45:20.774300 ip-10-0-131-31 kubenswrapper[2577]: I0416 20:45:20.774269 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-target-t77hx_1280bc20-0a50-4b20-8b90-8057a3627648/network-check-target-container/0.log"